[ 571.913726] env[59615]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 572.361929] env[59659]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 573.889822] env[59659]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' {{(pid=59659) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 573.890184] env[59659]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' {{(pid=59659) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 573.890238] env[59659]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' {{(pid=59659) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 573.890525] env[59659]: INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
[ 573.891606] env[59659]: WARNING nova.servicegroup.api [-] Report interval must be less than service down time. Current config: . Setting service_down_time to: 300
[ 574.007630] env[59659]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm {{(pid=59659) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}}
[ 574.018036] env[59659]: DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.010s {{(pid=59659) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}}
[ 574.118220] env[59659]: INFO nova.virt.driver [None req-961f59cd-1eea-45c2-b8e8-daa58d560110 None None] Loading compute driver 'vmwareapi.VMwareVCDriver'
[ 574.191267] env[59659]: DEBUG oslo_concurrency.lockutils [-] Acquiring lock "oslo_vmware_api_lock" by "oslo_vmware.api.VMwareAPISession._create_session" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 574.191421] env[59659]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" acquired by "oslo_vmware.api.VMwareAPISession._create_session" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 574.191505] env[59659]: DEBUG oslo_vmware.service [-] Creating suds client with soap_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk' and wsdl_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk/vimService.wsdl' {{(pid=59659) __init__ /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:242}}
[ 577.400070] env[59659]: DEBUG oslo_vmware.service [-] Invoking ServiceInstance.RetrieveServiceContent with opID=oslo.vmware-8ff68c31-8d64-4348-96d9-7beb2687ed7d {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 577.415500] env[59659]: DEBUG oslo_vmware.api [-] Logging into host: vc1.osci.c.eu-de-1.cloud.sap. {{(pid=59659) _create_session /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:242}}
[ 577.415626] env[59659]: DEBUG oslo_vmware.service [-] Invoking SessionManager.Login with opID=oslo.vmware-29f01f89-1129-4118-8f81-79ee8dd037d7 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 577.449944] env[59659]: INFO oslo_vmware.api [-] Successfully established new session; session ID is 36e91.
[ 577.450082] env[59659]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" "released" by "oslo_vmware.api.VMwareAPISession._create_session" :: held 3.259s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 577.450720] env[59659]: INFO nova.virt.vmwareapi.driver [None req-961f59cd-1eea-45c2-b8e8-daa58d560110 None None] VMware vCenter version: 7.0.3
[ 577.454143] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16f64a31-76b4-4d0c-8b2e-39d1c28fda7a {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 577.471241] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-968f0673-d0e1-461c-b264-b9a2225287b4 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 577.477322] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2fd0fcf0-954d-446f-81ea-0ff4642aa01d {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 577.484092] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc32cd20-dca1-4d7f-a3a0-e360a6f1b3bc {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 577.497197] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da60d07a-6b4a-4a89-96c5-4a286bdbca5b {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 577.502975] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d3773d4-d037-4469-ab7e-31d13111bc9d {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 577.533194] env[59659]: DEBUG oslo_vmware.service [-] Invoking ExtensionManager.FindExtension with opID=oslo.vmware-a9b7b0db-a578-4427-b1ee-7535cabb75f5 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 577.538431] env[59659]: DEBUG nova.virt.vmwareapi.driver [None req-961f59cd-1eea-45c2-b8e8-daa58d560110 None None] Extension org.openstack.compute already exists. {{(pid=59659) _register_openstack_extension /opt/stack/nova/nova/virt/vmwareapi/driver.py:214}}
[ 577.541240] env[59659]: INFO nova.compute.provider_config [None req-961f59cd-1eea-45c2-b8e8-daa58d560110 None None] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
[ 577.558486] env[59659]: DEBUG nova.context [None req-961f59cd-1eea-45c2-b8e8-daa58d560110 None None] Found 2 cells: 00000000-0000-0000-0000-000000000000(cell0),74f912f4-43d8-4d2a-9ea1-d6a83c370e35(cell1) {{(pid=59659) load_cells /opt/stack/nova/nova/context.py:464}}
[ 577.560450] env[59659]: DEBUG oslo_concurrency.lockutils [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 577.560666] env[59659]: DEBUG oslo_concurrency.lockutils [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 577.561479] env[59659]: DEBUG oslo_concurrency.lockutils [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 577.561830] env[59659]: DEBUG oslo_concurrency.lockutils [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] Acquiring lock "74f912f4-43d8-4d2a-9ea1-d6a83c370e35" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 577.562036] env[59659]: DEBUG oslo_concurrency.lockutils [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] Lock "74f912f4-43d8-4d2a-9ea1-d6a83c370e35" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 577.562990] env[59659]: DEBUG oslo_concurrency.lockutils [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] Lock "74f912f4-43d8-4d2a-9ea1-d6a83c370e35" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 577.575765] env[59659]: DEBUG oslo_db.sqlalchemy.engines [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=59659) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}}
[ 577.577818] env[59659]: DEBUG oslo_db.sqlalchemy.engines [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=59659) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}}
[ 577.582322] env[59659]: ERROR nova.db.main.api [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] No DB access allowed in nova-compute: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 577.582322] env[59659]: result = function(*args, **kwargs)
[ 577.582322] env[59659]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 577.582322] env[59659]: return func(*args, **kwargs)
[ 577.582322] env[59659]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 577.582322] env[59659]: result = fn(*args, **kwargs)
[ 577.582322] env[59659]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 577.582322] env[59659]: return f(*args, **kwargs)
[ 577.582322] env[59659]: File "/opt/stack/nova/nova/objects/service.py", line 546, in _db_service_get_minimum_version
[ 577.582322] env[59659]: return db.service_get_minimum_version(context, binaries)
[ 577.582322] env[59659]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 577.582322] env[59659]: _check_db_access()
[ 577.582322] env[59659]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 577.582322] env[59659]: stacktrace = ''.join(traceback.format_stack())
[ 577.582322] env[59659]:
[ 577.588060] env[59659]: ERROR nova.db.main.api [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] No DB access allowed in nova-compute: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 577.588060] env[59659]: result = function(*args, **kwargs)
[ 577.588060] env[59659]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 577.588060] env[59659]: return func(*args, **kwargs)
[ 577.588060] env[59659]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 577.588060] env[59659]: result = fn(*args, **kwargs)
[ 577.588060] env[59659]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 577.588060] env[59659]: return f(*args, **kwargs)
[ 577.588060] env[59659]: File "/opt/stack/nova/nova/objects/service.py", line 546, in _db_service_get_minimum_version
[ 577.588060] env[59659]: return db.service_get_minimum_version(context, binaries)
[ 577.588060] env[59659]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 577.588060] env[59659]: _check_db_access()
[ 577.588060] env[59659]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 577.588060] env[59659]: stacktrace = ''.join(traceback.format_stack())
[ 577.588060] env[59659]:
[ 577.588608] env[59659]: WARNING nova.objects.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] Failed to get minimum service version for cell 00000000-0000-0000-0000-000000000000
[ 577.588797] env[59659]: WARNING nova.objects.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] Failed to get minimum service version for cell 74f912f4-43d8-4d2a-9ea1-d6a83c370e35
[ 577.589218] env[59659]: DEBUG oslo_concurrency.lockutils [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] Acquiring lock "singleton_lock" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 577.589374] env[59659]: DEBUG oslo_concurrency.lockutils [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] Acquired lock "singleton_lock" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 577.589610] env[59659]: DEBUG oslo_concurrency.lockutils [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] Releasing lock "singleton_lock" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 577.589927] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] Full set of CONF: {{(pid=59659) _wait_for_exit_or_signal
/usr/local/lib/python3.10/dist-packages/oslo_service/service.py:362}} [ 577.590087] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] ******************************************************************************** {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2589}} [ 577.590214] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] Configuration options gathered from: {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2590}} [ 577.590346] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] command line args: ['--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-cpu-common.conf', '--config-file', '/etc/nova/nova-cpu-1.conf'] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2591}} [ 577.590536] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2592}} [ 577.590661] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] ================================================================================ {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2594}} [ 577.590866] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] allow_resize_to_same_host = True {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.591075] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] arq_binding_timeout = 300 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.591213] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] backdoor_port = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.591340] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] backdoor_socket = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.591556] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] block_device_allocate_retries = 60 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.591680] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] block_device_allocate_retries_interval = 3 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.591846] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cert = self.pem {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.592047] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] compute_driver = vmwareapi.VMwareVCDriver {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.592220] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] 
compute_monitors = [] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.592382] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] config_dir = [] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.592549] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] config_drive_format = iso9660 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.592681] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.592842] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] config_source = [] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.593013] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] console_host = devstack {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.593188] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] control_exchange = nova {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.593369] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cpu_allocation_ratio = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.593531] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] daemon = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.593694] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] debug = True {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.593846] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] default_access_ip_network_name = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.594051] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] default_availability_zone = nova {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.594235] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] default_ephemeral_format = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.594472] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 
'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.594633] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] default_schedule_zone = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.594787] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] disk_allocation_ratio = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.594950] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] enable_new_services = True {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.595135] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] enabled_apis = ['osapi_compute'] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.595295] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] enabled_ssl_apis = [] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.595452] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] flat_injected = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.595607] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] force_config_drive = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.595760] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] force_raw_images = True {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.595926] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] graceful_shutdown_timeout = 5 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.596095] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] heal_instance_info_cache_interval = 60 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.596310] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] host = cpu-1 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.596507] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] initial_cpu_allocation_ratio = 4.0 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.596669] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] initial_disk_allocation_ratio = 1.0 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.596825] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] initial_ram_allocation_ratio = 1.0 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.597143] env[59659]: DEBUG oslo_service.service [None 
req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.597201] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] instance_build_timeout = 0 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.597360] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] instance_delete_interval = 300 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.597527] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] instance_format = [instance: %(uuid)s] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.597690] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] instance_name_template = instance-%08x {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.597850] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] instance_usage_audit = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.598030] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] instance_usage_audit_period = month {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.598198] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.598363] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] instances_path = /opt/stack/data/nova/instances {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.598527] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] internal_service_availability_zone = internal {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.598684] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] key = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.598842] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] live_migration_retry_count = 30 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.599023] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] log_config_append = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.599184] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.599342] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] log_dir = None {{(pid=59659) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.599498] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] log_file = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.599625] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] log_options = True {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.599781] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] log_rotate_interval = 1 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.599948] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] log_rotate_interval_type = days {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.600127] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] log_rotation_type = none {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.600256] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.600379] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.600544] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.600706] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.600830] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.601034] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] long_rpc_timeout = 1800 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.601201] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] max_concurrent_builds = 10 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.601364] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] max_concurrent_live_migrations = 1 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.601519] env[59659]: DEBUG oslo_service.service [None 
req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] max_concurrent_snapshots = 5 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.601675] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] max_local_block_devices = 3 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.601828] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] max_logfile_count = 30 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.602034] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] max_logfile_size_mb = 200 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.602208] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] maximum_instance_delete_attempts = 5 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.602379] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] metadata_listen = 0.0.0.0 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.602547] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] metadata_listen_port = 8775 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.602713] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] metadata_workers = 2 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.602876] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] migrate_max_retries = -1 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.603059] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] mkisofs_cmd = genisoimage {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.603269] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] my_block_storage_ip = 10.180.1.21 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.603400] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] my_ip = 10.180.1.21 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.603561] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] network_allocate_retries = 0 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.603738] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.603903] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] osapi_compute_listen = 0.0.0.0 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.604076] env[59659]: DEBUG oslo_service.service [None 
req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] osapi_compute_listen_port = 8774 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.604243] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] osapi_compute_unique_server_name_scope = {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.604408] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] osapi_compute_workers = 2 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.604567] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] password_length = 12 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.604724] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] periodic_enable = True {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.604882] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] periodic_fuzzy_delay = 60 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.605057] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] pointer_model = usbtablet {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.605221] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] preallocate_images = none {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.605380] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] publish_errors = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.605510] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] pybasedir = /opt/stack/nova {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.605662] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] ram_allocation_ratio = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.605816] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] rate_limit_burst = 0 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.605976] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] rate_limit_except_level = CRITICAL {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.606180] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] rate_limit_interval = 0 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.606347] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] reboot_timeout = 0 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.606505] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] 
reclaim_instance_interval = 0 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.606660] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] record = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.606824] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] reimage_timeout_per_gb = 60 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.606991] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] report_interval = 120 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.607166] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] rescue_timeout = 0 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.607322] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] reserved_host_cpus = 0 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.607478] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] reserved_host_disk_mb = 0 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.607642] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] reserved_host_memory_mb = 512 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.607945] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] reserved_huge_pages = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.607945] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] resize_confirm_window = 0 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.608100] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] resize_fs_using_block_device = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.608262] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] resume_guests_state_on_host_boot = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.608428] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.608582] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] rpc_response_timeout = 60 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.608736] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] run_external_periodic_tasks = True {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.608907] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] running_deleted_instance_action = reap 
{{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.609068] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] running_deleted_instance_poll_interval = 1800 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.609224] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] running_deleted_instance_timeout = 0 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.609379] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] scheduler_instance_sync_interval = 120 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.609511] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] service_down_time = 300 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.609675] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] servicegroup_driver = db {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.609834] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] shelved_offload_time = 0 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.609991] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] shelved_poll_interval = 3600 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.610171] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] shutdown_timeout = 0 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.610330] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] source_is_ipv6 = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.610487] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] ssl_only = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.610728] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] state_path = /opt/stack/data/n-cpu-1 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.610917] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] sync_power_state_interval = 600 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.611087] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] sync_power_state_pool_size = 1000 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.611268] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] syslog_log_facility = LOG_USER {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.611418] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] tempdir = None {{(pid=59659) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.611572] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] timeout_nbd = 10 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.611776] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] transport_url = **** {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.612072] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] update_resources_interval = 0 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.612142] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] use_cow_images = True {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.612255] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] use_eventlog = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.612411] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] use_journal = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.612563] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] use_json = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.612717] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] use_rootwrap_daemon = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.612869] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] use_stderr = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.613031] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] use_syslog = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.613187] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vcpu_pin_set = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.613350] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vif_plugging_is_fatal = True {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.613514] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vif_plugging_timeout = 300 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.613677] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] virt_mkfs = [] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.613834] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] volume_usage_poll_interval = 0 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.613989] env[59659]: DEBUG oslo_service.service [None 
req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] watch_log_file = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.614168] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] web = /usr/share/spice-html5 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 577.614356] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_concurrency.disable_process_locking = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.614952] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_concurrency.lock_path = /opt/stack/data/n-cpu-1 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.615169] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.615343] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_messaging_metrics.metrics_enabled = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.615522] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_messaging_metrics.metrics_process_name = {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.615692] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.615863] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.616066] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] api.auth_strategy = keystone {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.616239] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] api.compute_link_prefix = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.616416] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.616591] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] api.dhcp_domain = novalocal {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.616758] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] api.enable_instance_password = True {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.616923] env[59659]: DEBUG oslo_service.service [None 
req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] api.glance_link_prefix = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.617115] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.617350] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] api.instance_list_cells_batch_strategy = distributed {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.617524] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] api.instance_list_per_project_cells = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.617687] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] api.list_records_by_skipping_down_cells = True {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.617848] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] api.local_metadata_per_cell = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.618039] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] api.max_limit = 1000 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.618236] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] api.metadata_cache_expiration = 15 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.618422] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] api.neutron_default_tenant_id = default {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.618592] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] api.use_forwarded_for = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.618760] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] api.use_neutron_default_nets = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.618927] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] api.vendordata_dynamic_connect_timeout = 5 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.619106] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] api.vendordata_dynamic_failure_fatal = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.619275] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] api.vendordata_dynamic_read_timeout = 5 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.619447] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] api.vendordata_dynamic_ssl_certfile = {{(pid=59659) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.619616] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] api.vendordata_dynamic_targets = [] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.619783] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] api.vendordata_jsonfile_path = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.619963] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] api.vendordata_providers = ['StaticJSON'] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.620182] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cache.backend = dogpile.cache.memcached {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.620355] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cache.backend_argument = **** {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.620521] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cache.config_prefix = cache.oslo {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.620688] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cache.dead_timeout = 60.0 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.620851] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cache.debug_cache_backend = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.621060] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cache.enable_retry_client = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.621237] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cache.enable_socket_keepalive = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.621432] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cache.enabled = True {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.621566] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cache.expiration_time = 600 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.621771] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cache.hashclient_retry_attempts = 2 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.622036] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cache.hashclient_retry_delay = 1.0 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.622231] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] 
cache.memcache_dead_retry = 300 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.622407] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cache.memcache_password = {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.622574] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cache.memcache_pool_connection_get_timeout = 10 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.622743] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cache.memcache_pool_flush_on_reconnect = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.622906] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cache.memcache_pool_maxsize = 10 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.623082] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cache.memcache_pool_unused_timeout = 60 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.623249] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cache.memcache_sasl_enabled = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.623428] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cache.memcache_servers = ['localhost:11211'] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.623594] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cache.memcache_socket_timeout = 1.0 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.623763] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cache.memcache_username = {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.623930] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cache.proxies = [] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.624108] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cache.retry_attempts = 2 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.624284] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cache.retry_delay = 0.0 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.624447] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cache.socket_keepalive_count = 1 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.624610] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cache.socket_keepalive_idle = 1 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.624765] env[59659]: DEBUG oslo_service.service [None 
req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cache.socket_keepalive_interval = 1 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.624921] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cache.tls_allowed_ciphers = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.625088] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cache.tls_cafile = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.625246] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cache.tls_certfile = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.625405] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cache.tls_enabled = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.625558] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cache.tls_keyfile = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.625729] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cinder.auth_section = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.625903] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cinder.auth_type = password {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.626078] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cinder.cafile = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.626253] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cinder.catalog_info = volumev3::publicURL {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.626414] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cinder.certfile = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.626596] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cinder.collect_timing = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.626805] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cinder.cross_az_attach = True {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.626980] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cinder.debug = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.627161] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cinder.endpoint_template = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.627326] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d 
None None] cinder.http_retries = 3 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.627488] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cinder.insecure = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.627649] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cinder.keyfile = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.627819] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cinder.os_region_name = RegionOne {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.627984] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cinder.split_loggers = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.628165] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cinder.timeout = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.628338] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] compute.consecutive_build_service_disable_threshold = 10 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.628497] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] compute.cpu_dedicated_set = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.628652] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] compute.cpu_shared_set = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.628843] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] compute.image_type_exclude_list = [] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.628972] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] compute.live_migration_wait_for_vif_plug = True {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.629148] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] compute.max_concurrent_disk_ops = 0 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.629309] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] compute.max_disk_devices_to_attach = -1 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.629470] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] compute.packing_host_numa_cells_allocation_strategy = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.629636] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 
577.629796] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] compute.resource_provider_association_refresh = 300 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.629955] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] compute.shutdown_retry_interval = 10 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.630150] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.630329] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] conductor.workers = 2 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.630507] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] console.allowed_origins = [] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.630663] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] console.ssl_ciphers = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.630831] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] console.ssl_minimum_version = default {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.631042] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] consoleauth.token_ttl = 600 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.631238] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cyborg.cafile = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.631417] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cyborg.certfile = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.631586] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cyborg.collect_timing = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.631744] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cyborg.connect_retries = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.631930] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cyborg.connect_retry_delay = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.632186] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cyborg.endpoint_override = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.632279] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cyborg.insecure = False {{(pid=59659) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.632433] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cyborg.keyfile = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.632590] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cyborg.max_version = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.632745] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cyborg.min_version = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.632899] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cyborg.region_name = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.633071] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cyborg.service_name = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.633243] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cyborg.service_type = accelerator {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.633406] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cyborg.split_loggers = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.633561] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cyborg.status_code_retries = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.633717] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cyborg.status_code_retry_delay = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.633874] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cyborg.timeout = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.634066] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.634236] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] cyborg.version = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.634417] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] database.backend = sqlalchemy {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.634595] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] database.connection = **** {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.634769] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] database.connection_debug = 0 {{(pid=59659) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.634937] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] database.connection_parameters = {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.635118] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] database.connection_recycle_time = 3600 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.635291] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] database.connection_trace = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.635457] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] database.db_inc_retry_interval = True {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.635618] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] database.db_max_retries = 20 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.635779] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] database.db_max_retry_interval = 10 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.635939] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] database.db_retry_interval = 1 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.636126] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] database.max_overflow = 50 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.636289] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] database.max_pool_size = 5 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.636455] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] database.max_retries = 10 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.636618] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] database.mysql_enable_ndb = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.636785] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] database.mysql_sql_mode = TRADITIONAL {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.636946] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] database.mysql_wsrep_sync_wait = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.637122] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] database.pool_timeout = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.637294] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] database.retry_interval = 10 
{{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.637451] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] database.slave_connection = **** {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.637615] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] database.sqlite_synchronous = True {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.638072] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] database.use_db_reconnect = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.638072] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] api_database.backend = sqlalchemy {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.638145] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] api_database.connection = **** {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.638286] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] api_database.connection_debug = 0 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.638455] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] api_database.connection_parameters = {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.638619] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] api_database.connection_recycle_time = 3600 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.638785] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] api_database.connection_trace = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.638957] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] api_database.db_inc_retry_interval = True {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.639126] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] api_database.db_max_retries = 20 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.639289] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] api_database.db_max_retry_interval = 10 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.639451] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] api_database.db_retry_interval = 1 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.639618] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] api_database.max_overflow = 50 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.639779] env[59659]: DEBUG oslo_service.service [None 
req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] api_database.max_pool_size = 5 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.639945] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] api_database.max_retries = 10 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.640123] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] api_database.mysql_enable_ndb = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.640296] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.640452] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] api_database.mysql_wsrep_sync_wait = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.640611] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] api_database.pool_timeout = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.640788] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] api_database.retry_interval = 10 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.640974] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] api_database.slave_connection = **** {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.641172] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] api_database.sqlite_synchronous = True {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.641347] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] devices.enabled_mdev_types = [] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.641522] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.641687] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] ephemeral_storage_encryption.enabled = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.641850] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] ephemeral_storage_encryption.key_size = 512 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.642159] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] glance.api_servers = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.642355] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] glance.cafile = None {{(pid=59659) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.642614] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] glance.certfile = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.642892] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] glance.collect_timing = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.643180] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] glance.connect_retries = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.643433] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] glance.connect_retry_delay = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.643684] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] glance.debug = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.643962] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] glance.default_trusted_certificate_ids = [] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.644226] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] glance.enable_certificate_validation = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.644473] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] glance.enable_rbd_download = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.644672] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] glance.endpoint_override = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.644851] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] glance.insecure = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.645029] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] glance.keyfile = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.645194] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] glance.max_version = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.645350] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] glance.min_version = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.645512] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] glance.num_retries = 3 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.645680] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] glance.rbd_ceph_conf = {{(pid=59659) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.645843] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] glance.rbd_connect_timeout = 5 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.646019] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] glance.rbd_pool = {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.646189] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] glance.rbd_user = {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.646347] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] glance.region_name = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.646503] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] glance.service_name = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.646669] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] glance.service_type = image {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.646827] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] glance.split_loggers = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.646984] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] glance.status_code_retries = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.647154] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] glance.status_code_retry_delay = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.647311] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] glance.timeout = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.647488] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.647647] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] glance.verify_glance_signatures = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.647804] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] glance.version = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.647969] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] guestfs.debug = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.648160] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] hyperv.config_drive_cdrom = False {{(pid=59659) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.648326] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] hyperv.config_drive_inject_password = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.649429] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.649429] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] hyperv.enable_instance_metrics_collection = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.649429] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] hyperv.enable_remotefx = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.649429] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] hyperv.instances_path_share = {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.649429] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] hyperv.iscsi_initiator_list = [] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.649429] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] hyperv.limit_cpu_features = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.649429] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.649696] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.649744] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] hyperv.power_state_check_timeframe = 60 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.649901] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] hyperv.power_state_event_polling_interval = 2 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.650119] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] hyperv.qemu_img_cmd = qemu-img.exe {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.650261] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] hyperv.use_multipath_io = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.650424] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] hyperv.volume_attach_retry_count = 10 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.650583] env[59659]: DEBUG oslo_service.service [None 
req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] hyperv.volume_attach_retry_interval = 5 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.650740] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] hyperv.vswitch_name = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.650913] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.651112] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] mks.enabled = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.651590] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.651670] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] image_cache.manager_interval = 2400 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.651838] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] image_cache.precache_concurrency = 1 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.652040] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] image_cache.remove_unused_base_images = True {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.652219] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.652388] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.652565] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] image_cache.subdirectory_name = _base {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.652740] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] ironic.api_max_retries = 60 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.652904] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] ironic.api_retry_interval = 2 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.653077] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] ironic.auth_section = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.653246] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] ironic.auth_type = None {{(pid=59659) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.653405] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] ironic.cafile = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.653560] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] ironic.certfile = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.653721] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] ironic.collect_timing = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.653874] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] ironic.connect_retries = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.654041] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] ironic.connect_retry_delay = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.654194] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] ironic.endpoint_override = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.654351] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] ironic.insecure = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.654503] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] ironic.keyfile = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.654654] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] ironic.max_version = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.654803] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] ironic.min_version = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.654954] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] ironic.partition_key = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.657851] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] ironic.peer_list = [] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.657851] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] ironic.region_name = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.657851] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] ironic.serial_console_state_timeout = 10 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.657851] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] ironic.service_name = None {{(pid=59659) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.657851] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] ironic.service_type = baremetal {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.657851] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] ironic.split_loggers = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.657851] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] ironic.status_code_retries = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.658137] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] ironic.status_code_retry_delay = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.658137] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] ironic.timeout = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.658137] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] ironic.valid_interfaces = ['internal', 'public'] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.658137] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] ironic.version = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.658137] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.658137] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] key_manager.fixed_key = **** {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.658322] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] barbican.auth_endpoint = http://localhost/identity/v3 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.658322] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] barbican.barbican_api_version = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.658322] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] barbican.barbican_endpoint = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.658322] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] barbican.barbican_endpoint_type = public {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.658322] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] barbican.barbican_region_name = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.658322] env[59659]: DEBUG oslo_service.service [None 
req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] barbican.cafile = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.658322] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] barbican.certfile = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.658533] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] barbican.collect_timing = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.658533] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] barbican.insecure = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.658680] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] barbican.keyfile = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.658845] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] barbican.number_of_retries = 60 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.659021] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] barbican.retry_delay = 1 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.659183] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] barbican.send_service_user_token = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.659345] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] barbican.split_loggers = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.659500] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] barbican.timeout = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.659660] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] barbican.verify_ssl = True {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.659815] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] barbican.verify_ssl_path = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.659980] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] barbican_service_user.auth_section = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.660168] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] barbican_service_user.auth_type = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.660327] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] barbican_service_user.cafile = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.660484] env[59659]: DEBUG oslo_service.service [None 
req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] barbican_service_user.certfile = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.660645] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] barbican_service_user.collect_timing = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.660803] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] barbican_service_user.insecure = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.660993] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] barbican_service_user.keyfile = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.661201] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] barbican_service_user.split_loggers = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.661395] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] barbican_service_user.timeout = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.661583] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vault.approle_role_id = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.661745] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vault.approle_secret_id = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.661924] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vault.cafile = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.662135] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vault.certfile = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.662317] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vault.collect_timing = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.662480] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vault.insecure = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.662637] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vault.keyfile = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.662808] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vault.kv_mountpoint = secret {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.662976] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vault.kv_version = 2 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.663148] env[59659]: DEBUG oslo_service.service [None 
req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vault.namespace = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.663309] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vault.root_token_id = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.663471] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vault.split_loggers = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.663629] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vault.ssl_ca_crt_file = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.663787] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vault.timeout = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.663946] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vault.use_ssl = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.664133] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.664302] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] keystone.cafile = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.664460] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] keystone.certfile = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.664623] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] keystone.collect_timing = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.664781] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] keystone.connect_retries = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.664939] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] keystone.connect_retry_delay = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.665109] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] keystone.endpoint_override = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.665274] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] keystone.insecure = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.665429] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] keystone.keyfile = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.665582] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d 
None None] keystone.max_version = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.665737] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] keystone.min_version = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.665893] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] keystone.region_name = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.666059] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] keystone.service_name = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.666230] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] keystone.service_type = identity {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.666392] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] keystone.split_loggers = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.666549] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] keystone.status_code_retries = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.666704] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] keystone.status_code_retry_delay = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.666860] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] keystone.timeout = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.667048] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.667214] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] keystone.version = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.667414] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.connection_uri = {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.667575] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.cpu_mode = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.667746] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.cpu_model_extra_flags = [] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.667903] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.cpu_models = [] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.668088] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None 
None] libvirt.cpu_power_governor_high = performance {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.668261] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.cpu_power_governor_low = powersave {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.668427] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.cpu_power_management = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.668593] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.668759] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.device_detach_attempts = 8 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.668934] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.device_detach_timeout = 20 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.669103] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.disk_cachemodes = [] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.669285] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.disk_prefix = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.669427] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.enabled_perf_events = [] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.669588] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.file_backed_memory = 0 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.669752] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.gid_maps = [] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.669909] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.hw_disk_discard = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.670081] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.hw_machine_type = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.670251] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.images_rbd_ceph_conf = {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.670414] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.670578] env[59659]: DEBUG 
oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.670745] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.images_rbd_glance_store_name = {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.670938] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.images_rbd_pool = rbd {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.671134] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.images_type = default {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.671295] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.images_volume_group = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.671456] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.inject_key = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.671619] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.inject_partition = -2 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.671778] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.inject_password = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.671968] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.iscsi_iface = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.672183] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.iser_use_multipath = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.672356] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.live_migration_bandwidth = 0 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.672521] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.live_migration_completion_timeout = 800 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.672686] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.live_migration_downtime = 500 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.672849] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.live_migration_downtime_delay = 75 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.673068] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.live_migration_downtime_steps = 10 {{(pid=59659) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.673185] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.live_migration_inbound_addr = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.673350] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.live_migration_permit_auto_converge = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.673511] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.live_migration_permit_post_copy = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.673687] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.live_migration_scheme = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.673862] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.live_migration_timeout_action = abort {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.674037] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.live_migration_tunnelled = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.674203] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.live_migration_uri = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.674366] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.live_migration_with_native_tls = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.674524] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.max_queues = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.674686] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.mem_stats_period_seconds = 10 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.674843] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.nfs_mount_options = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.675175] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.nfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.675350] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.num_aoe_discover_tries = 3 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.675515] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.num_iser_scan_tries = 5 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.675676] env[59659]: DEBUG 
oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.num_memory_encrypted_guests = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.675842] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.num_nvme_discover_tries = 5 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.676024] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.num_pcie_ports = 0 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.676198] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.num_volume_scan_tries = 5 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.676366] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.pmem_namespaces = [] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.676527] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.quobyte_client_cfg = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.676811] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.quobyte_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.676984] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.rbd_connect_timeout = 5 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.677168] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.677334] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.677496] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.rbd_secret_uuid = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.677654] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.rbd_user = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.677819] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.realtime_scheduler_priority = 1 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.677991] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.remote_filesystem_transport = ssh {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.678170] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.rescue_image_id = None {{(pid=59659) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.678326] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.rescue_kernel_id = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.678483] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.rescue_ramdisk_id = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.678649] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.rng_dev_path = /dev/urandom {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.678841] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.rx_queue_size = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.678974] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.smbfs_mount_options = {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.679263] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.smbfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.679434] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.snapshot_compression = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.679597] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.snapshot_image_format = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.679810] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.679978] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.sparse_logical_volumes = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.680164] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.swtpm_enabled = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.680335] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.swtpm_group = tss {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.680505] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.swtpm_user = tss {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.680674] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.sysinfo_serial = unique {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.680834] env[59659]: DEBUG oslo_service.service [None 
req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.tx_queue_size = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.681037] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.uid_maps = [] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.681217] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.use_virtio_for_bridges = True {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.681390] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.virt_type = kvm {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.681561] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.volume_clear = zero {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.681725] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.volume_clear_size = 0 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.681910] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.volume_use_multipath = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.682117] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.vzstorage_cache_path = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.682303] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.682475] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.vzstorage_mount_group = qemu {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.682645] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.vzstorage_mount_opts = [] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.682811] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.vzstorage_mount_perms = 0770 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.683099] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.683284] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.vzstorage_mount_user = stack {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.683450] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=59659) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.683622] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] neutron.auth_section = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.683793] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] neutron.auth_type = password {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.683956] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] neutron.cafile = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.684134] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] neutron.certfile = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.684302] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] neutron.collect_timing = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.684462] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] neutron.connect_retries = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.684620] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] neutron.connect_retry_delay = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.684790] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] neutron.default_floating_pool = public {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.684950] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] neutron.endpoint_override = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.685132] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] neutron.extension_sync_interval = 600 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.685298] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] neutron.http_retries = 3 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.685462] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] neutron.insecure = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.685622] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] neutron.keyfile = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.685780] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] neutron.max_version = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.685949] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] neutron.metadata_proxy_shared_secret = **** {{(pid=59659) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.686123] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] neutron.min_version = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.686292] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] neutron.ovs_bridge = br-int {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.686459] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] neutron.physnets = [] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.686627] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] neutron.region_name = RegionOne {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.686796] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] neutron.service_metadata_proxy = True {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.686958] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] neutron.service_name = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.687139] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] neutron.service_type = network {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.687304] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] neutron.split_loggers = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.687462] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] neutron.status_code_retries = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.687621] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] neutron.status_code_retry_delay = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.687780] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] neutron.timeout = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.687959] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.688147] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] neutron.version = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.688321] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] notifications.bdms_in_notifications = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.688499] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] notifications.default_level = INFO 
{{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.688671] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] notifications.notification_format = unversioned {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.688834] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] notifications.notify_on_state_change = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.689014] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.689208] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] pci.alias = [] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.689378] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] pci.device_spec = [] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.689542] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] pci.report_in_placement = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.689712] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] placement.auth_section = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.689885] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] placement.auth_type = password {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.690066] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] placement.auth_url = http://10.180.1.21/identity {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.690229] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] placement.cafile = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.690388] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] placement.certfile = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.690551] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] placement.collect_timing = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.690709] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] placement.connect_retries = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.690866] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] placement.connect_retry_delay = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.691062] env[59659]: DEBUG oslo_service.service [None 
req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] placement.default_domain_id = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.691228] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] placement.default_domain_name = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.691390] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] placement.domain_id = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.691542] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] placement.domain_name = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.691697] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] placement.endpoint_override = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.691858] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] placement.insecure = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.692075] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] placement.keyfile = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.692252] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] placement.max_version = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.692412] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] placement.min_version = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.692582] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] placement.password = **** {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.692741] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] placement.project_domain_id = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.692911] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] placement.project_domain_name = Default {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.693087] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] placement.project_id = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.693301] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] placement.project_name = service {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.693429] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] placement.region_name = RegionOne {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.693589] env[59659]: DEBUG oslo_service.service 
[None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] placement.service_name = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.693754] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] placement.service_type = placement {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.693917] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] placement.split_loggers = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.694122] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] placement.status_code_retries = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.694309] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] placement.status_code_retry_delay = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.694473] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] placement.system_scope = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.694628] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] placement.timeout = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.694787] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] placement.trust_id = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.694943] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] placement.user_domain_id = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.695128] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] placement.user_domain_name = Default {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.695289] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] placement.user_id = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.695460] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] placement.username = placement {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.695643] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] placement.valid_interfaces = ['internal', 'public'] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.695803] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] placement.version = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.695979] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] quota.cores = 20 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.696160] env[59659]: DEBUG 
oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] quota.count_usage_from_placement = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.696330] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.696503] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] quota.injected_file_content_bytes = 10240 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.696668] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] quota.injected_file_path_length = 255 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.696834] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] quota.injected_files = 5 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.696998] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] quota.instances = 10 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.697183] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] quota.key_pairs = 100 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.697352] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] quota.metadata_items = 128 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.697518] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] quota.ram = 51200 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.697680] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] quota.recheck_quota = True {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.697845] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] quota.server_group_members = 10 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.698018] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] quota.server_groups = 10 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.698202] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] rdp.enabled = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.698517] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.698707] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] scheduler.discover_hosts_in_cells_interval = -1 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.698879] 
env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.699057] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] scheduler.image_metadata_prefilter = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.699223] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.699387] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] scheduler.max_attempts = 3 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.699551] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] scheduler.max_placement_results = 1000 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.699715] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.699878] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] scheduler.query_placement_for_availability_zone = True {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.700056] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] scheduler.query_placement_for_image_type_support = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.700221] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.700396] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] scheduler.workers = 2 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.700567] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.700734] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] filter_scheduler.aggregate_image_properties_isolation_separator = . 
{{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.700930] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.701130] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.701306] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.701470] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.701634] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.701824] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.702056] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] filter_scheduler.host_subset_size = 1 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.702255] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] filter_scheduler.image_properties_default_architecture = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.702427] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.702596] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] filter_scheduler.isolated_hosts = [] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.702760] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] filter_scheduler.isolated_images = [] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.702923] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] filter_scheduler.max_instances_per_host = 50 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.703101] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=59659) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.703268] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] filter_scheduler.pci_in_placement = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.703437] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.703595] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.703754] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.703914] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.704090] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.704261] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.704426] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] filter_scheduler.track_instance_changes = True {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.704603] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.704776] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] metrics.required = True {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.704941] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] metrics.weight_multiplier = 1.0 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.705118] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] metrics.weight_of_unavailable = -10000.0 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.705284] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] metrics.weight_setting = [] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.705574] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=59659) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.705747] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] serial_console.enabled = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.705923] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] serial_console.port_range = 10000:20000 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.706107] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.706277] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.706441] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] serial_console.serialproxy_port = 6083 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.706606] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] service_user.auth_section = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.706775] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] service_user.auth_type = password {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.706933] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] service_user.cafile = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.707105] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] service_user.certfile = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.707269] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] service_user.collect_timing = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.707430] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] service_user.insecure = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.707584] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] service_user.keyfile = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.707757] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] service_user.send_service_user_token = True {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.707915] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] service_user.split_loggers = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.708087] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None 
None] service_user.timeout = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.708260] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] spice.agent_enabled = True {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.708446] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] spice.enabled = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.708734] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.708928] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] spice.html5proxy_host = 0.0.0.0 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.709136] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] spice.html5proxy_port = 6082 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.709279] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] spice.image_compression = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.709439] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] spice.jpeg_compression = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.709600] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] spice.playback_compression = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.709770] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] spice.server_listen = 127.0.0.1 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.709938] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.710116] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] spice.streaming_mode = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.710279] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] spice.zlib_compression = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.710448] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] upgrade_levels.baseapi = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.710607] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] upgrade_levels.cert = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.710779] env[59659]: DEBUG oslo_service.service [None 
req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] upgrade_levels.compute = auto {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.710964] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] upgrade_levels.conductor = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.711160] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] upgrade_levels.scheduler = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.711333] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vendordata_dynamic_auth.auth_section = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.711497] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vendordata_dynamic_auth.auth_type = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.711653] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vendordata_dynamic_auth.cafile = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.711810] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vendordata_dynamic_auth.certfile = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.712009] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vendordata_dynamic_auth.collect_timing = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.712206] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vendordata_dynamic_auth.insecure = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.712369] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vendordata_dynamic_auth.keyfile = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.712532] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vendordata_dynamic_auth.split_loggers = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.712689] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vendordata_dynamic_auth.timeout = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.712862] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vmware.api_retry_count = 10 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.713034] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vmware.ca_file = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.713215] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vmware.cache_prefix = devstack-image-cache {{(pid=59659) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.713384] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vmware.cluster_name = testcl1 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.713552] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vmware.connection_pool_size = 10 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.713708] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vmware.console_delay_seconds = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.713876] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vmware.datastore_regex = ^datastore.* {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.714100] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vmware.host_ip = vc1.osci.c.eu-de-1.cloud.sap {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.714273] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vmware.host_password = **** {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.714439] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vmware.host_port = 443 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.714608] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vmware.host_username = administrator@vsphere.local {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.714775] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vmware.insecure = True {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.714936] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vmware.integration_bridge = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.715116] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vmware.maximum_objects = 100 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.715278] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vmware.pbm_default_policy = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.715441] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vmware.pbm_enabled = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.715599] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vmware.pbm_wsdl_location = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.715768] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vmware.serial_log_dir = 
/opt/vmware/vspc {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.715926] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vmware.serial_port_proxy_uri = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.716099] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vmware.serial_port_service_uri = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.716268] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vmware.task_poll_interval = 0.5 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.716443] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vmware.use_linked_clone = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.716607] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vmware.vnc_keymap = en-us {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.716772] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vmware.vnc_port = 5900 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.716935] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vmware.vnc_port_total = 10000 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.717137] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vnc.auth_schemes = ['none'] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.717314] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vnc.enabled = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.717595] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vnc.novncproxy_base_url = http://127.0.0.1:6080/vnc_auto.html {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.717780] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vnc.novncproxy_host = 0.0.0.0 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.717948] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vnc.novncproxy_port = 6080 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.718139] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vnc.server_listen = 127.0.0.1 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.718315] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vnc.server_proxyclient_address = 127.0.0.1 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.718476] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d 
None None] vnc.vencrypt_ca_certs = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.718634] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vnc.vencrypt_client_cert = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.718792] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vnc.vencrypt_client_key = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.718967] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.719145] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] workarounds.disable_deep_image_inspection = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.719309] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] workarounds.disable_fallback_pcpu_query = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.719478] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] workarounds.disable_group_policy_check_upcall = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.719638] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.719801] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] workarounds.disable_rootwrap = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.719960] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] workarounds.enable_numa_live_migration = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.720138] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.720297] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.720458] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] workarounds.handle_virt_lifecycle_events = True {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.720618] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] workarounds.libvirt_disable_apic = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.720775] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] 
workarounds.never_download_image_if_on_rbd = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.720977] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] workarounds.qemu_monitor_announce_self_count = 3 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.721168] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.721332] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.721492] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.721652] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.721811] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.721993] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.722179] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.722345] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.722529] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.722697] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] wsgi.client_socket_timeout = 900 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.722863] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] wsgi.default_pool_size = 1000 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.723059] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] wsgi.keep_alive = True {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.723250] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] 
wsgi.max_header_line = 16384 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.723430] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] wsgi.secure_proxy_ssl_header = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.723624] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] wsgi.ssl_ca_file = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.723751] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] wsgi.ssl_cert_file = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.723910] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] wsgi.ssl_key_file = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.724089] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] wsgi.tcp_keepidle = 600 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.724266] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.724434] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] zvm.ca_file = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.724593] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] zvm.cloud_connector_url = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.726451] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] zvm.image_tmp_path = /opt/stack/data/n-cpu-1/images {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.726646] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] zvm.reachable_timeout = 300 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.726836] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_policy.enforce_new_defaults = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.727024] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_policy.enforce_scope = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.727235] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_policy.policy_default_rule = default {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.727428] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} 
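Editor's note: every DEBUG line in the dump above points at oslo_config/cfg.py:2609, i.e. it is produced by oslo.config's ConfigOpts.log_opt_values(), which nova-compute calls once at startup to record every registered option as "group.option = value" (options registered with secret=True, such as vmware.host_password and the notification transport_url, are masked as ****). A minimal, self-contained sketch of that mechanism, using a hypothetical option group rather than Nova's real option set:

```python
# Minimal sketch of the option dump seen above. The group/option names here
# are hypothetical; only the mechanism (ConfigOpts.log_opt_values) matches
# what nova-compute does at service start.
import logging

from oslo_config import cfg

CONF = cfg.ConfigOpts()
CONF.register_opts(
    [
        cfg.StrOpt('host_ip', default='127.0.0.1'),
        cfg.StrOpt('host_password', default='s3cret', secret=True),  # dumped as ****
        cfg.BoolOpt('insecure', default=False),
    ],
    group='demo_vmware',
)
CONF(args=[], project='demo')  # parse (empty) CLI args and any config files found

logging.basicConfig(level=logging.DEBUG)
# Emits one "demo_vmware.option = value" DEBUG line per registered option,
# followed by a closing row of asterisks, like the dump in this log.
CONF.log_opt_values(logging.getLogger(__name__), logging.DEBUG)
```

The values in the dump above are simply whatever the service resolved from its loaded configuration files plus the built-in defaults for everything left unset.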
[ 577.727605] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_policy.policy_file = policy.yaml {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.727775] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.727938] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.728112] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.728271] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.728431] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.728597] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.728794] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.728948] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] profiler.connection_string = messaging:// {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.729131] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] profiler.enabled = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.729301] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] profiler.es_doc_type = notification {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.729458] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] profiler.es_scroll_size = 10000 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.729620] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] profiler.es_scroll_time = 2m {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.729777] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] profiler.filter_error_trace = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.729941] env[59659]: DEBUG oslo_service.service [None 
req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] profiler.hmac_keys = SECRET_KEY {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.730126] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] profiler.sentinel_service_name = mymaster {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.730294] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] profiler.socket_timeout = 0.1 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.730453] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] profiler.trace_sqlalchemy = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.730617] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] remote_debug.host = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.730771] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] remote_debug.port = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.730973] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.731164] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.731329] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.731492] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.731652] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.731812] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.731981] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.732148] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.732311] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] 
oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.732469] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_messaging_rabbit.kombu_compression = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.732640] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.732811] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.732981] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.733164] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.733352] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.733533] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.733738] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.733876] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.734055] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.734221] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.734384] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.734546] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.734705] env[59659]: DEBUG 
oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.734867] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.735044] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.735215] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_messaging_rabbit.ssl = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.735385] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.735550] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.735710] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.735877] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.736057] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_messaging_rabbit.ssl_version = {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.736250] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.736414] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_messaging_notifications.retry = -1 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.736593] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.736769] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_messaging_notifications.transport_url = **** {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.736937] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_limit.auth_section = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.737111] env[59659]: DEBUG oslo_service.service [None 
req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_limit.auth_type = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.737268] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_limit.cafile = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.737421] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_limit.certfile = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.737578] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_limit.collect_timing = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.737731] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_limit.connect_retries = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.737886] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_limit.connect_retry_delay = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.738053] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_limit.endpoint_id = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.738210] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_limit.endpoint_override = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.738368] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_limit.insecure = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.738522] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_limit.keyfile = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.738676] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_limit.max_version = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.738829] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_limit.min_version = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.738980] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_limit.region_name = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.739148] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_limit.service_name = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.739303] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_limit.service_type = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.739496] env[59659]: DEBUG oslo_service.service [None 
req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_limit.split_loggers = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.739613] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_limit.status_code_retries = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.739765] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_limit.status_code_retry_delay = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.739917] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_limit.timeout = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.740091] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_limit.valid_interfaces = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.740245] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_limit.version = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.740410] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_reports.file_event_handler = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.740571] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_reports.file_event_handler_interval = 1 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.740725] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] oslo_reports.log_dir = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.740909] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vif_plug_linux_bridge_privileged.capabilities = [12] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.741098] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vif_plug_linux_bridge_privileged.group = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.741258] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vif_plug_linux_bridge_privileged.helper_command = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.741421] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.741580] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vif_plug_linux_bridge_privileged.thread_pool_size = 8 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.741736] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] 
vif_plug_linux_bridge_privileged.user = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.741922] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vif_plug_ovs_privileged.capabilities = [12, 1] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.742107] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vif_plug_ovs_privileged.group = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.742272] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vif_plug_ovs_privileged.helper_command = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.742442] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.742615] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vif_plug_ovs_privileged.thread_pool_size = 8 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.742769] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] vif_plug_ovs_privileged.user = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.742936] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] os_vif_linux_bridge.flat_interface = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.743131] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] os_vif_linux_bridge.forward_bridge_interface = ['all'] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.743329] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] os_vif_linux_bridge.iptables_bottom_regex = {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.743510] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] os_vif_linux_bridge.iptables_drop_action = DROP {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.743677] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] os_vif_linux_bridge.iptables_top_regex = {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.743847] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] os_vif_linux_bridge.network_device_mtu = 1500 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.744009] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] os_vif_linux_bridge.use_ipv6 = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.744181] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] os_vif_linux_bridge.vlan_interface = None 
{{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.744358] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] os_vif_ovs.isolate_vif = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.744524] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] os_vif_ovs.network_device_mtu = 1500 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.744682] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] os_vif_ovs.ovs_vsctl_timeout = 120 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.744846] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.745016] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] os_vif_ovs.ovsdb_interface = native {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.745182] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] os_vif_ovs.per_port_bridge = False {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.745347] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] os_brick.lock_path = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.745662] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] privsep_osbrick.capabilities = [21] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.745662] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] privsep_osbrick.group = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.745801] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] privsep_osbrick.helper_command = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.745960] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] privsep_osbrick.logger_name = os_brick.privileged {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.746135] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] privsep_osbrick.thread_pool_size = 8 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.746288] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] privsep_osbrick.user = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.746455] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.746611] env[59659]: DEBUG oslo_service.service 
[None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] nova_sys_admin.group = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.746767] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] nova_sys_admin.helper_command = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.746923] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] nova_sys_admin.logger_name = oslo_privsep.daemon {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.747096] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] nova_sys_admin.thread_pool_size = 8 {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.747251] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] nova_sys_admin.user = None {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 577.747375] env[59659]: DEBUG oslo_service.service [None req-96c2b470-ecbb-4910-9dcf-2886a023666d None None] ******************************************************************************** {{(pid=59659) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2613}} [ 577.747806] env[59659]: INFO nova.service [-] Starting compute node (version 0.1.0) [ 577.756320] env[59659]: INFO nova.virt.node [None req-d378fb9c-2f1e-4c59-86ad-c53a9645ad73 None None] Generated node identity 69a84459-8a9e-4a6c-afd9-ec42e61132ce [ 577.756553] env[59659]: INFO nova.virt.node [None req-d378fb9c-2f1e-4c59-86ad-c53a9645ad73 None None] Wrote node identity 69a84459-8a9e-4a6c-afd9-ec42e61132ce to /opt/stack/data/n-cpu-1/compute_id [ 577.767184] env[59659]: WARNING nova.compute.manager [None req-d378fb9c-2f1e-4c59-86ad-c53a9645ad73 None None] Compute nodes ['69a84459-8a9e-4a6c-afd9-ec42e61132ce'] for host cpu-1 were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning. [ 577.798554] env[59659]: INFO nova.compute.manager [None req-d378fb9c-2f1e-4c59-86ad-c53a9645ad73 None None] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host [ 577.817794] env[59659]: WARNING nova.compute.manager [None req-d378fb9c-2f1e-4c59-86ad-c53a9645ad73 None None] No compute node record found for host cpu-1. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host cpu-1 could not be found. 
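Editor's note: the last option groups dumped before the service-start messages (vif_plug_linux_bridge_privileged, vif_plug_ovs_privileged, privsep_osbrick, nova_sys_admin) each log a capabilities list of bare integers; these are the Linux capabilities the corresponding oslo.privsep helper is allowed to retain. A small sketch decoding the values logged above — the numbers are copied verbatim from the dump, while the number-to-name mapping is the standard linux/capability.h numbering and is not itself part of this log:

```python
# Decode the privsep capability lists from the option dump above.
CAP_NAMES = {
    0: 'CAP_CHOWN',
    1: 'CAP_DAC_OVERRIDE',
    2: 'CAP_DAC_READ_SEARCH',
    3: 'CAP_FOWNER',
    12: 'CAP_NET_ADMIN',
    21: 'CAP_SYS_ADMIN',
}

PRIVSEP_CONTEXTS = {
    'vif_plug_linux_bridge_privileged': [12],
    'vif_plug_ovs_privileged': [12, 1],
    'privsep_osbrick': [21],
    'nova_sys_admin': [0, 1, 2, 3, 12, 21],
}

for context, caps in PRIVSEP_CONTEXTS.items():
    names = ', '.join(CAP_NAMES.get(c, str(c)) for c in caps)
    print(f'{context}: {names}')
```

Read this way, the VIF-plug contexts only keep network administration rights (plus CAP_DAC_OVERRIDE for the OVS case), os-brick keeps CAP_SYS_ADMIN, and nova_sys_admin keeps the broadest set, which is consistent with it backing the most privileged compute operations.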
[ 577.818047] env[59659]: DEBUG oslo_concurrency.lockutils [None req-d378fb9c-2f1e-4c59-86ad-c53a9645ad73 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 577.818648] env[59659]: DEBUG oslo_concurrency.lockutils [None req-d378fb9c-2f1e-4c59-86ad-c53a9645ad73 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 577.818648] env[59659]: DEBUG oslo_concurrency.lockutils [None req-d378fb9c-2f1e-4c59-86ad-c53a9645ad73 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 577.818648] env[59659]: DEBUG nova.compute.resource_tracker [None req-d378fb9c-2f1e-4c59-86ad-c53a9645ad73 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59659) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 577.819687] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca1c50db-afff-4870-bab1-a58888d10f0e {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 577.828793] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-92ca6977-6293-4b84-9725-339a3005764a {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 577.843117] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c86a710e-c8d7-48e1-a6f3-aaaf8d0cb1c3 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 577.849762] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93ad5cc8-e5d0-46bd-8f3c-30b4a606df9c {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 577.879973] env[59659]: DEBUG nova.compute.resource_tracker [None req-d378fb9c-2f1e-4c59-86ad-c53a9645ad73 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181417MB free_disk=177GB free_vcpus=48 pci_devices=None {{(pid=59659) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 577.880136] env[59659]: DEBUG oslo_concurrency.lockutils [None req-d378fb9c-2f1e-4c59-86ad-c53a9645ad73 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 577.880315] env[59659]: DEBUG oslo_concurrency.lockutils [None req-d378fb9c-2f1e-4c59-86ad-c53a9645ad73 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 577.892405] env[59659]: WARNING nova.compute.resource_tracker [None req-d378fb9c-2f1e-4c59-86ad-c53a9645ad73 None None] No compute node 
record for cpu-1:69a84459-8a9e-4a6c-afd9-ec42e61132ce: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 69a84459-8a9e-4a6c-afd9-ec42e61132ce could not be found. [ 577.905387] env[59659]: INFO nova.compute.resource_tracker [None req-d378fb9c-2f1e-4c59-86ad-c53a9645ad73 None None] Compute node record created for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 with uuid: 69a84459-8a9e-4a6c-afd9-ec42e61132ce [ 577.953939] env[59659]: DEBUG nova.compute.resource_tracker [None req-d378fb9c-2f1e-4c59-86ad-c53a9645ad73 None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=59659) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 577.954107] env[59659]: DEBUG nova.compute.resource_tracker [None req-d378fb9c-2f1e-4c59-86ad-c53a9645ad73 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=59659) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 578.056462] env[59659]: INFO nova.scheduler.client.report [None req-d378fb9c-2f1e-4c59-86ad-c53a9645ad73 None None] [req-89fe2f33-32ce-4f22-82fa-236ab80f98fc] Created resource provider record via placement API for resource provider with UUID 69a84459-8a9e-4a6c-afd9-ec42e61132ce and name domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28. [ 578.071706] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7622b314-2cff-4899-8a3b-63ec8ac1fd2e {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 578.079145] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d31d5f1-4858-47bb-85cf-a782f59cfebf {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 578.108465] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cadf7ace-f14b-4f14-9457-8703598667dd {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 578.115149] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea38cf4b-aa58-40f0-9e44-00ca20f338e0 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 578.127615] env[59659]: DEBUG nova.compute.provider_tree [None req-d378fb9c-2f1e-4c59-86ad-c53a9645ad73 None None] Updating inventory in ProviderTree for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 177, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 578.161309] env[59659]: DEBUG nova.scheduler.client.report [None req-d378fb9c-2f1e-4c59-86ad-c53a9645ad73 None None] Updated inventory for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce with generation 0 in Placement from set_inventory_for_provider using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 
'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 177, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}} [ 578.161518] env[59659]: DEBUG nova.compute.provider_tree [None req-d378fb9c-2f1e-4c59-86ad-c53a9645ad73 None None] Updating resource provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce generation from 0 to 1 during operation: update_inventory {{(pid=59659) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 578.161665] env[59659]: DEBUG nova.compute.provider_tree [None req-d378fb9c-2f1e-4c59-86ad-c53a9645ad73 None None] Updating inventory in ProviderTree for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 177, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 578.204426] env[59659]: DEBUG nova.compute.provider_tree [None req-d378fb9c-2f1e-4c59-86ad-c53a9645ad73 None None] Updating resource provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce generation from 1 to 2 during operation: update_traits {{(pid=59659) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 578.222051] env[59659]: DEBUG nova.compute.resource_tracker [None req-d378fb9c-2f1e-4c59-86ad-c53a9645ad73 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59659) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 578.222435] env[59659]: DEBUG oslo_concurrency.lockutils [None req-d378fb9c-2f1e-4c59-86ad-c53a9645ad73 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.342s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 578.222435] env[59659]: DEBUG nova.service [None req-d378fb9c-2f1e-4c59-86ad-c53a9645ad73 None None] Creating RPC server for service compute {{(pid=59659) start /opt/stack/nova/nova/service.py:182}} [ 578.235303] env[59659]: DEBUG nova.service [None req-d378fb9c-2f1e-4c59-86ad-c53a9645ad73 None None] Join ServiceGroup membership for this service compute {{(pid=59659) start /opt/stack/nova/nova/service.py:199}} [ 578.235463] env[59659]: DEBUG nova.servicegroup.drivers.db [None req-d378fb9c-2f1e-4c59-86ad-c53a9645ad73 None None] DB_Driver: join new ServiceGroup member cpu-1 to the compute group, service = {{(pid=59659) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} [ 619.716643] env[59659]: DEBUG oslo_concurrency.lockutils [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] Acquiring lock "42920efb-be41-4813-b33e-d49c6f4fb47c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 619.718125] env[59659]: DEBUG oslo_concurrency.lockutils [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 
tempest-ImagesNegativeTestJSON-1415358462-project-member] Lock "42920efb-be41-4813-b33e-d49c6f4fb47c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 619.742437] env[59659]: DEBUG nova.compute.manager [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] Starting instance... {{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 619.843383] env[59659]: DEBUG oslo_concurrency.lockutils [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 619.843383] env[59659]: DEBUG oslo_concurrency.lockutils [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 619.847077] env[59659]: INFO nova.compute.claims [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 620.012931] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b767387-1a0a-4973-93f9-5823a79c8bc8 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 620.021164] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1072e8b4-f9ed-41c3-837e-3440642ffaad {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 620.060012] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2540d5b-aa6f-4c1c-a247-b7b9edfde00d {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 620.068151] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3104f8b7-d317-4b10-973d-b4c0bf09d694 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 620.082844] env[59659]: DEBUG nova.compute.provider_tree [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 620.087704] env[59659]: DEBUG oslo_concurrency.lockutils [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] Acquiring lock "75398340-5ec7-4e3f-abc1-602f838d7ef3" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 620.087910] env[59659]: DEBUG oslo_concurrency.lockutils [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] Lock "75398340-5ec7-4e3f-abc1-602f838d7ef3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 620.096394] env[59659]: DEBUG nova.scheduler.client.report [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 177, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 620.105290] env[59659]: DEBUG nova.compute.manager [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] Starting instance... {{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 620.123182] env[59659]: DEBUG oslo_concurrency.lockutils [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.280s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 620.123759] env[59659]: DEBUG nova.compute.manager [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] Start building networks asynchronously for instance. 
{{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 620.163750] env[59659]: DEBUG oslo_concurrency.lockutils [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 620.164286] env[59659]: DEBUG oslo_concurrency.lockutils [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 620.165519] env[59659]: INFO nova.compute.claims [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 620.173151] env[59659]: DEBUG nova.compute.utils [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] Using /dev/sd instead of None {{(pid=59659) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 620.173632] env[59659]: DEBUG nova.compute.manager [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] Allocating IP information in the background. {{(pid=59659) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 620.173856] env[59659]: DEBUG nova.network.neutron [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] allocate_for_instance() {{(pid=59659) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 620.191763] env[59659]: DEBUG nova.compute.manager [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] Start building block device mappings for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 620.308897] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9fa87bdb-d0cf-47af-85af-2b5dc1da788d {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 620.313240] env[59659]: DEBUG nova.compute.manager [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] Start spawning the instance on the hypervisor. 
{{(pid=59659) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 620.318836] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b2332fa-c8ac-4b19-a246-4a780b2950cd {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 620.356340] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ac3399df-6ed5-47c2-8258-386af4525a9c {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 620.364261] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad88c8fd-5d7a-4c09-bc05-7a904993669f {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 620.382158] env[59659]: DEBUG nova.compute.provider_tree [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 620.401075] env[59659]: DEBUG nova.scheduler.client.report [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 177, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 620.427762] env[59659]: DEBUG oslo_concurrency.lockutils [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.264s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 620.428345] env[59659]: DEBUG nova.compute.manager [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] Start building networks asynchronously for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 620.474607] env[59659]: DEBUG nova.compute.utils [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] Using /dev/sd instead of None {{(pid=59659) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 620.475962] env[59659]: DEBUG nova.compute.manager [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] Allocating IP information in the background. 
{{(pid=59659) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 620.479688] env[59659]: DEBUG nova.network.neutron [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] allocate_for_instance() {{(pid=59659) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 620.490613] env[59659]: DEBUG nova.compute.manager [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] Start building block device mappings for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 620.583841] env[59659]: DEBUG nova.compute.manager [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] Start spawning the instance on the hypervisor. {{(pid=59659) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 620.761374] env[59659]: DEBUG nova.virt.hardware [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-30T19:41:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-30T19:41:37Z,direct_url=,disk_format='vmdk',id=0fa786c9-f55e-46dc-b725-aa456ca9ff53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='3f4ccdf020b5413c9a4233eccb6b55a8',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-30T19:41:37Z,virtual_size=,visibility=), allow threads: False {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 620.761706] env[59659]: DEBUG nova.virt.hardware [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] Flavor limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 620.762261] env[59659]: DEBUG nova.virt.hardware [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] Image limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 620.762261] env[59659]: DEBUG nova.virt.hardware [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] Flavor pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 620.763136] env[59659]: DEBUG nova.virt.hardware [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] Image pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 620.764469] env[59659]: DEBUG nova.virt.hardware [None 
req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 620.764469] env[59659]: DEBUG nova.virt.hardware [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 620.764469] env[59659]: DEBUG nova.virt.hardware [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 620.764668] env[59659]: DEBUG nova.virt.hardware [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] Got 1 possible topologies {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 620.764884] env[59659]: DEBUG nova.virt.hardware [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 620.765078] env[59659]: DEBUG nova.virt.hardware [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 620.770824] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a9d6186c-70e1-4f80-9cac-1b6eb1329f10 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 620.779133] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-25d2e339-7958-4a13-a2a9-57a66b55389d {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 620.791461] env[59659]: DEBUG nova.virt.hardware [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-30T19:41:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta 
ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-30T19:41:37Z,direct_url=,disk_format='vmdk',id=0fa786c9-f55e-46dc-b725-aa456ca9ff53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='3f4ccdf020b5413c9a4233eccb6b55a8',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-30T19:41:37Z,virtual_size=,visibility=), allow threads: False {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 620.791461] env[59659]: DEBUG nova.virt.hardware [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] Flavor limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 620.791461] env[59659]: DEBUG nova.virt.hardware [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] Image limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 620.791684] env[59659]: DEBUG nova.virt.hardware [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] Flavor pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 620.791684] env[59659]: DEBUG nova.virt.hardware [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] Image pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 620.791684] env[59659]: DEBUG nova.virt.hardware [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 620.791684] env[59659]: DEBUG nova.virt.hardware [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 620.791684] env[59659]: DEBUG nova.virt.hardware [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 620.791827] env[59659]: DEBUG nova.virt.hardware [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] Got 1 possible topologies {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 620.791992] env[59659]: DEBUG nova.virt.hardware [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 620.792171] env[59659]: DEBUG nova.virt.hardware [None 
req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 620.793035] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca4bd90a-81fe-4c13-a9b6-2df1ad902b19 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 620.814545] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb214c24-f165-40e8-a526-5e5fc4096815 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 620.837261] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1625d304-fba8-4004-b28d-c788eba4fa14 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 621.025302] env[59659]: DEBUG nova.policy [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c807c41f5b8647b982f74a260d5b3c39', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'acb2cb42f7814275b0d7369f2e7ab372', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59659) authorize /opt/stack/nova/nova/policy.py:203}} [ 621.030935] env[59659]: DEBUG oslo_concurrency.lockutils [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] Acquiring lock "5e7637fe-8828-4c16-a629-0d82f1efded9" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 621.034019] env[59659]: DEBUG oslo_concurrency.lockutils [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] Lock "5e7637fe-8828-4c16-a629-0d82f1efded9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 621.045030] env[59659]: DEBUG nova.compute.manager [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] Starting instance... 
{{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 621.121767] env[59659]: DEBUG oslo_concurrency.lockutils [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 621.122053] env[59659]: DEBUG oslo_concurrency.lockutils [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 621.123600] env[59659]: INFO nova.compute.claims [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 621.132372] env[59659]: DEBUG nova.policy [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '77b71019b4134229ae7abdc61869bdd6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fa8a67eb479240bfa25cc71563cdde3d', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59659) authorize /opt/stack/nova/nova/policy.py:203}} [ 621.309421] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c72643cb-d661-4874-bdd6-e2a8d911334e {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 621.321853] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f35b486-fb35-4cc7-af4d-fd3c25971eb0 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 621.358258] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-300ac010-a333-497a-9dfb-f79a52033b6b {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 621.370439] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aaf1cb3d-b369-4ca4-89eb-2b4651dd83ac {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 621.387236] env[59659]: DEBUG nova.compute.provider_tree [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 621.397057] env[59659]: DEBUG nova.scheduler.client.report [None 
req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 177, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 621.411614] env[59659]: DEBUG oslo_concurrency.lockutils [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.289s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 621.414095] env[59659]: DEBUG nova.compute.manager [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] Start building networks asynchronously for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 621.456795] env[59659]: DEBUG nova.compute.utils [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] Using /dev/sd instead of None {{(pid=59659) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 621.458996] env[59659]: DEBUG nova.compute.manager [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] Allocating IP information in the background. {{(pid=59659) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 621.459379] env[59659]: DEBUG nova.network.neutron [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] allocate_for_instance() {{(pid=59659) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 621.480901] env[59659]: DEBUG nova.compute.manager [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] Start building block device mappings for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 621.582023] env[59659]: DEBUG nova.compute.manager [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] Start spawning the instance on the hypervisor. 
{{(pid=59659) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 621.616788] env[59659]: DEBUG nova.virt.hardware [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-30T19:41:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-30T19:41:37Z,direct_url=,disk_format='vmdk',id=0fa786c9-f55e-46dc-b725-aa456ca9ff53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='3f4ccdf020b5413c9a4233eccb6b55a8',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-30T19:41:37Z,virtual_size=,visibility=), allow threads: False {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 621.618637] env[59659]: DEBUG nova.virt.hardware [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] Flavor limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 621.618637] env[59659]: DEBUG nova.virt.hardware [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] Image limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 621.618637] env[59659]: DEBUG nova.virt.hardware [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] Flavor pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 621.618637] env[59659]: DEBUG nova.virt.hardware [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] Image pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 621.618637] env[59659]: DEBUG nova.virt.hardware [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 621.618859] env[59659]: DEBUG nova.virt.hardware [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 621.618859] env[59659]: DEBUG nova.virt.hardware [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59659) 
_get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 621.618859] env[59659]: DEBUG nova.virt.hardware [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] Got 1 possible topologies {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 621.622761] env[59659]: DEBUG nova.virt.hardware [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 621.622761] env[59659]: DEBUG nova.virt.hardware [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 621.624244] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aae6329c-1cd4-4bca-b1c9-3c89734a391b {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 621.637995] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a905f819-3133-4ffa-8811-c3dd92037e16 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 621.874982] env[59659]: DEBUG nova.policy [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c8161a460c49451dbc88db22452e20f7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'da38ec4f2cad4c0b8b429771350420a8', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59659) authorize /opt/stack/nova/nova/policy.py:203}} [ 622.166145] env[59659]: DEBUG nova.network.neutron [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] Successfully created port: 8afd2645-2674-4db5-9c6b-4581a3e8bd46 {{(pid=59659) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 622.862766] env[59659]: DEBUG nova.network.neutron [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] Successfully created port: 7fb0cc84-bd6f-4ed9-a2a5-8e37c68b3ed1 {{(pid=59659) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 623.901274] env[59659]: DEBUG nova.network.neutron [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] Successfully created port: e7aa5e57-f725-4a6c-a6f6-93fb3aad32ea {{(pid=59659) 
_create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 624.544648] env[59659]: DEBUG oslo_concurrency.lockutils [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Acquiring lock "4cdfac2f-6963-4f71-9a42-709f2eeb4f9a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 624.544901] env[59659]: DEBUG oslo_concurrency.lockutils [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Lock "4cdfac2f-6963-4f71-9a42-709f2eeb4f9a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 624.555536] env[59659]: DEBUG nova.compute.manager [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] Starting instance... {{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 624.611399] env[59659]: DEBUG oslo_concurrency.lockutils [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 624.611399] env[59659]: DEBUG oslo_concurrency.lockutils [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 624.612879] env[59659]: INFO nova.compute.claims [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 624.736389] env[59659]: ERROR nova.compute.manager [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 8afd2645-2674-4db5-9c6b-4581a3e8bd46, please check neutron logs for more information. 
[ 624.736389] env[59659]: ERROR nova.compute.manager Traceback (most recent call last): [ 624.736389] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 624.736389] env[59659]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 624.736389] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 624.736389] env[59659]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 624.736389] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 624.736389] env[59659]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 624.736389] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 624.736389] env[59659]: ERROR nova.compute.manager self.force_reraise() [ 624.736389] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 624.736389] env[59659]: ERROR nova.compute.manager raise self.value [ 624.736389] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 624.736389] env[59659]: ERROR nova.compute.manager updated_port = self._update_port( [ 624.736389] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 624.736389] env[59659]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 624.736864] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 624.736864] env[59659]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 624.736864] env[59659]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 8afd2645-2674-4db5-9c6b-4581a3e8bd46, please check neutron logs for more information. 
[ 624.736864] env[59659]: ERROR nova.compute.manager [ 624.736864] env[59659]: Traceback (most recent call last): [ 624.736864] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 624.736864] env[59659]: listener.cb(fileno) [ 624.736864] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 624.736864] env[59659]: result = function(*args, **kwargs) [ 624.736864] env[59659]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 624.736864] env[59659]: return func(*args, **kwargs) [ 624.736864] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 624.736864] env[59659]: raise e [ 624.736864] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 624.736864] env[59659]: nwinfo = self.network_api.allocate_for_instance( [ 624.736864] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 624.736864] env[59659]: created_port_ids = self._update_ports_for_instance( [ 624.736864] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 624.736864] env[59659]: with excutils.save_and_reraise_exception(): [ 624.736864] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 624.736864] env[59659]: self.force_reraise() [ 624.736864] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 624.736864] env[59659]: raise self.value [ 624.736864] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 624.736864] env[59659]: updated_port = self._update_port( [ 624.736864] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 624.736864] env[59659]: _ensure_no_port_binding_failure(port) [ 624.736864] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 624.736864] env[59659]: raise exception.PortBindingFailed(port_id=port['id']) [ 624.737672] env[59659]: nova.exception.PortBindingFailed: Binding failed for port 8afd2645-2674-4db5-9c6b-4581a3e8bd46, please check neutron logs for more information. [ 624.737672] env[59659]: Removing descriptor: 12 [ 624.739612] env[59659]: ERROR nova.compute.manager [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 8afd2645-2674-4db5-9c6b-4581a3e8bd46, please check neutron logs for more information. 
[ 624.739612] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] Traceback (most recent call last): [ 624.739612] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 624.739612] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] yield resources [ 624.739612] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 624.739612] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] self.driver.spawn(context, instance, image_meta, [ 624.739612] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 624.739612] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 624.739612] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 624.739612] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] vm_ref = self.build_virtual_machine(instance, [ 624.739612] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 624.740122] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] vif_infos = vmwarevif.get_vif_info(self._session, [ 624.740122] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 624.740122] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] for vif in network_info: [ 624.740122] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 624.740122] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] return self._sync_wrapper(fn, *args, **kwargs) [ 624.740122] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 624.740122] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] self.wait() [ 624.740122] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 624.740122] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] self[:] = self._gt.wait() [ 624.740122] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 624.740122] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] return self._exit_event.wait() [ 624.740122] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 624.740122] env[59659]: ERROR nova.compute.manager 
[instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] result = hub.switch() [ 624.740122] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 624.740662] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] return self.greenlet.switch() [ 624.740662] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 624.740662] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] result = function(*args, **kwargs) [ 624.740662] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 624.740662] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] return func(*args, **kwargs) [ 624.740662] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 624.740662] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] raise e [ 624.740662] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 624.740662] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] nwinfo = self.network_api.allocate_for_instance( [ 624.740662] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 624.740662] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] created_port_ids = self._update_ports_for_instance( [ 624.740662] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 624.740662] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] with excutils.save_and_reraise_exception(): [ 624.741186] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 624.741186] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] self.force_reraise() [ 624.741186] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 624.741186] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] raise self.value [ 624.741186] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 624.741186] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] updated_port = self._update_port( [ 624.741186] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 624.741186] env[59659]: ERROR nova.compute.manager [instance: 
42920efb-be41-4813-b33e-d49c6f4fb47c] _ensure_no_port_binding_failure(port) [ 624.741186] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 624.741186] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] raise exception.PortBindingFailed(port_id=port['id']) [ 624.741186] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] nova.exception.PortBindingFailed: Binding failed for port 8afd2645-2674-4db5-9c6b-4581a3e8bd46, please check neutron logs for more information. [ 624.741186] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] [ 624.741722] env[59659]: INFO nova.compute.manager [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] Terminating instance [ 624.746279] env[59659]: DEBUG oslo_concurrency.lockutils [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] Acquiring lock "refresh_cache-42920efb-be41-4813-b33e-d49c6f4fb47c" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 624.746279] env[59659]: DEBUG oslo_concurrency.lockutils [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] Acquired lock "refresh_cache-42920efb-be41-4813-b33e-d49c6f4fb47c" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 624.746279] env[59659]: DEBUG nova.network.neutron [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 624.757288] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-10849fbc-c8ac-449f-9b56-4c6ebbb6b997 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 624.771192] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94b81204-8d7e-4b49-9ea2-75f805848743 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 624.805019] env[59659]: DEBUG nova.network.neutron [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] Instance cache missing network info. 
{{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 624.808589] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-838e1ecb-3c0e-4c83-860a-bc5893d2ade2 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 624.818907] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d7a71a47-670f-4652-9468-a7192837b336 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 624.833832] env[59659]: DEBUG nova.compute.provider_tree [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 624.847539] env[59659]: DEBUG nova.scheduler.client.report [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 177, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 624.867556] env[59659]: DEBUG oslo_concurrency.lockutils [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.257s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 624.868303] env[59659]: DEBUG nova.compute.manager [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] Start building networks asynchronously for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 624.918321] env[59659]: DEBUG nova.compute.utils [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Using /dev/sd instead of None {{(pid=59659) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 624.923457] env[59659]: DEBUG nova.compute.manager [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] Allocating IP information in the background. 
{{(pid=59659) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 624.923457] env[59659]: DEBUG nova.network.neutron [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] allocate_for_instance() {{(pid=59659) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 624.937229] env[59659]: DEBUG nova.compute.manager [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] Start building block device mappings for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 625.015565] env[59659]: DEBUG nova.network.neutron [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 625.024044] env[59659]: DEBUG nova.compute.manager [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] Start spawning the instance on the hypervisor. {{(pid=59659) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 625.034806] env[59659]: DEBUG oslo_concurrency.lockutils [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] Releasing lock "refresh_cache-42920efb-be41-4813-b33e-d49c6f4fb47c" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 625.035280] env[59659]: DEBUG nova.compute.manager [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] Start destroying the instance on the hypervisor. 
{{(pid=59659) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 625.035596] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] Destroying instance {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 625.036254] env[59659]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-27c4bb22-493e-418a-8d41-f7d54b0cab94 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 625.048554] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f69ae460-8a9c-4784-97b2-bef1bdacf307 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 625.062602] env[59659]: DEBUG nova.virt.hardware [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-30T19:41:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-30T19:41:37Z,direct_url=,disk_format='vmdk',id=0fa786c9-f55e-46dc-b725-aa456ca9ff53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='3f4ccdf020b5413c9a4233eccb6b55a8',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-30T19:41:37Z,virtual_size=,visibility=), allow threads: False {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 625.062602] env[59659]: DEBUG nova.virt.hardware [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Flavor limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 625.062602] env[59659]: DEBUG nova.virt.hardware [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Image limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 625.063095] env[59659]: DEBUG nova.virt.hardware [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Flavor pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 625.063095] env[59659]: DEBUG nova.virt.hardware [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Image pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 625.063095] env[59659]: DEBUG nova.virt.hardware [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, 
cores=65536, threads=65536 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 625.063282] env[59659]: DEBUG nova.virt.hardware [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 625.063432] env[59659]: DEBUG nova.virt.hardware [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 625.063623] env[59659]: DEBUG nova.virt.hardware [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Got 1 possible topologies {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 625.063740] env[59659]: DEBUG nova.virt.hardware [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 625.064133] env[59659]: DEBUG nova.virt.hardware [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 625.065577] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b661ae3-dbbb-427f-8bb2-6fd487c81caf {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 625.076036] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-19bc610c-b45d-46d6-9a9e-7e975da6c3ef {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 625.086252] env[59659]: WARNING nova.virt.vmwareapi.vmops [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 42920efb-be41-4813-b33e-d49c6f4fb47c could not be found. [ 625.086456] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] Instance destroyed {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 625.086833] env[59659]: INFO nova.compute.manager [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] Took 0.05 seconds to destroy the instance on the hypervisor. 
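
The nova.virt.hardware records above walk the 1-vCPU m1.nano flavor from a preferred topology of 0:0:0 and limits of 65536 sockets/cores/threads down to the single possible topology 1:1:1. A rough sketch of that enumeration, assuming only topologies whose sockets*cores*threads product exactly matches the vCPU count are kept (an illustration of the walk, not nova/virt/hardware.py):

# Illustration of the topology enumeration logged above.
from collections import namedtuple
from itertools import product

VirtCPUTopology = namedtuple('VirtCPUTopology', 'sockets cores threads')


def possible_topologies(vcpus, maximum, allow_threads=True):
    def limit(max_dim):
        # no dimension larger than the vCPU count can multiply out to it
        return range(1, min(max_dim, vcpus) + 1)

    for sockets, cores, threads in product(limit(maximum.sockets),
                                           limit(maximum.cores),
                                           limit(maximum.threads)):
        if sockets * cores * threads != vcpus:
            continue
        if not allow_threads and threads != 1:
            continue
        yield VirtCPUTopology(sockets, cores, threads)


maximum = VirtCPUTopology(sockets=65536, cores=65536, threads=65536)
print(list(possible_topologies(1, maximum, allow_threads=False)))
# -> [VirtCPUTopology(sockets=1, cores=1, threads=1)], matching the
#    "Got 1 possible topologies" and "Sorted desired topologies" lines.
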
[ 625.087163] env[59659]: DEBUG oslo.service.loopingcall [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=59659) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 625.087934] env[59659]: DEBUG nova.compute.manager [-] [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 625.088259] env[59659]: DEBUG nova.network.neutron [-] [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 625.159639] env[59659]: DEBUG nova.network.neutron [-] [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 625.180189] env[59659]: DEBUG nova.network.neutron [-] [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 625.192464] env[59659]: INFO nova.compute.manager [-] [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] Took 0.10 seconds to deallocate network for instance. [ 625.194670] env[59659]: DEBUG nova.compute.claims [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] Aborting claim: {{(pid=59659) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 625.194892] env[59659]: DEBUG oslo_concurrency.lockutils [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 625.195236] env[59659]: DEBUG oslo_concurrency.lockutils [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 625.243496] env[59659]: DEBUG nova.policy [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ddd7a5cce0914c8cbd4698144cfb5be5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e35d0d53e9be49c4812e6268e521dfaf', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59659) authorize /opt/stack/nova/nova/policy.py:203}} [ 625.344922] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-17f282b3-9260-41f9-bba3-0d98fc0f15e5 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 625.353214] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-80f19db3-5641-4efa-9d43-00a4e097b919 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 625.389593] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e33884b5-d5ac-4c57-b0c1-e0ba311ea44b {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 625.397471] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27cd3d23-bc56-42d4-898b-9b984ef4bbcb {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 625.411379] env[59659]: DEBUG nova.compute.provider_tree [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 625.420218] env[59659]: DEBUG nova.scheduler.client.report [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 177, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 625.438414] env[59659]: DEBUG oslo_concurrency.lockutils [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.243s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 625.439081] env[59659]: ERROR nova.compute.manager [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 8afd2645-2674-4db5-9c6b-4581a3e8bd46, please check neutron logs for more information. 
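
The inventory dict reported for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce above uses Placement's standard fields: usable capacity per resource class is (total - reserved) * allocation_ratio, while max_unit caps what a single allocation may take. A quick worked example over the logged values (the helper below only illustrates that arithmetic):

# Effective capacity for the inventory logged above, using Placement's
# usual formula: capacity = (total - reserved) * allocation_ratio.
inventory = {
    'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0},
}

for rc, inv in inventory.items():
    capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
    print(f"{rc}: {capacity:g}")
# VCPU: 192, MEMORY_MB: 196078, DISK_GB: 400 -- the headroom the scheduler
# sees each time it reports "Inventory has not changed" for this provider.
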
[ 625.439081] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] Traceback (most recent call last): [ 625.439081] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 625.439081] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] self.driver.spawn(context, instance, image_meta, [ 625.439081] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 625.439081] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 625.439081] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 625.439081] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] vm_ref = self.build_virtual_machine(instance, [ 625.439081] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 625.439081] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] vif_infos = vmwarevif.get_vif_info(self._session, [ 625.439081] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 625.440605] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] for vif in network_info: [ 625.440605] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 625.440605] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] return self._sync_wrapper(fn, *args, **kwargs) [ 625.440605] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 625.440605] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] self.wait() [ 625.440605] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 625.440605] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] self[:] = self._gt.wait() [ 625.440605] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 625.440605] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] return self._exit_event.wait() [ 625.440605] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 625.440605] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] result = hub.switch() [ 625.440605] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 625.440605] env[59659]: ERROR 
nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] return self.greenlet.switch() [ 625.440605] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 625.440991] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] result = function(*args, **kwargs) [ 625.440991] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 625.440991] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] return func(*args, **kwargs) [ 625.440991] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 625.440991] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] raise e [ 625.440991] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 625.440991] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] nwinfo = self.network_api.allocate_for_instance( [ 625.440991] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 625.440991] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] created_port_ids = self._update_ports_for_instance( [ 625.440991] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 625.440991] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] with excutils.save_and_reraise_exception(): [ 625.440991] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 625.440991] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] self.force_reraise() [ 625.441327] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 625.441327] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] raise self.value [ 625.441327] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 625.441327] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] updated_port = self._update_port( [ 625.441327] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 625.441327] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] _ensure_no_port_binding_failure(port) [ 625.441327] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 625.441327] env[59659]: ERROR 
nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] raise exception.PortBindingFailed(port_id=port['id']) [ 625.441327] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] nova.exception.PortBindingFailed: Binding failed for port 8afd2645-2674-4db5-9c6b-4581a3e8bd46, please check neutron logs for more information. [ 625.441327] env[59659]: ERROR nova.compute.manager [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] [ 625.441327] env[59659]: DEBUG nova.compute.utils [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] Binding failed for port 8afd2645-2674-4db5-9c6b-4581a3e8bd46, please check neutron logs for more information. {{(pid=59659) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 625.447219] env[59659]: DEBUG nova.compute.manager [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] Build of instance 42920efb-be41-4813-b33e-d49c6f4fb47c was re-scheduled: Binding failed for port 8afd2645-2674-4db5-9c6b-4581a3e8bd46, please check neutron logs for more information. {{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 625.448261] env[59659]: DEBUG nova.compute.manager [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] Unplugging VIFs for instance {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 625.448261] env[59659]: DEBUG oslo_concurrency.lockutils [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] Acquiring lock "refresh_cache-42920efb-be41-4813-b33e-d49c6f4fb47c" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 625.448261] env[59659]: DEBUG oslo_concurrency.lockutils [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] Acquired lock "refresh_cache-42920efb-be41-4813-b33e-d49c6f4fb47c" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 625.448261] env[59659]: DEBUG nova.network.neutron [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 625.504713] env[59659]: DEBUG nova.network.neutron [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] Instance cache missing network info. 
{{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 625.932135] env[59659]: DEBUG nova.network.neutron [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 625.945227] env[59659]: DEBUG oslo_concurrency.lockutils [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] Releasing lock "refresh_cache-42920efb-be41-4813-b33e-d49c6f4fb47c" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 625.945227] env[59659]: DEBUG nova.compute.manager [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 625.945227] env[59659]: DEBUG nova.compute.manager [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 625.945227] env[59659]: DEBUG nova.network.neutron [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 626.020060] env[59659]: DEBUG nova.network.neutron [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 626.032488] env[59659]: DEBUG nova.network.neutron [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 626.046870] env[59659]: INFO nova.compute.manager [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] [instance: 42920efb-be41-4813-b33e-d49c6f4fb47c] Took 0.10 seconds to deallocate network for instance. 
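
Most operations in this log are bracketed by oslo.concurrency's "Acquiring lock / acquired ... waited / released ... held" messages. A minimal stdlib stand-in that produces the same waited/held bookkeeping (an illustration of the pattern only, not the oslo_concurrency.lockutils API):

# Minimal sketch of the waited/held lock logging seen throughout this log.
import threading
import time
from contextlib import contextmanager

_locks = {}


@contextmanager
def timed_lock(name, by):
    lock = _locks.setdefault(name, threading.Lock())
    print(f'Acquiring lock "{name}" by "{by}"')
    start = time.monotonic()
    lock.acquire()
    acquired = time.monotonic()
    print(f'Lock "{name}" acquired by "{by}" :: waited {acquired - start:.3f}s')
    try:
        yield
    finally:
        lock.release()
        held = time.monotonic() - acquired
        print(f'Lock "{name}" "released" by "{by}" :: held {held:.3f}s')


with timed_lock("compute_resources", "ResourceTracker.instance_claim"):
    time.sleep(0.01)   # stand-in for the claim work timed in the log
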
[ 626.165945] env[59659]: INFO nova.scheduler.client.report [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] Deleted allocations for instance 42920efb-be41-4813-b33e-d49c6f4fb47c [ 626.194552] env[59659]: DEBUG oslo_concurrency.lockutils [None req-79d3ea84-51be-4b75-9270-5e934848994d tempest-ImagesNegativeTestJSON-1415358462 tempest-ImagesNegativeTestJSON-1415358462-project-member] Lock "42920efb-be41-4813-b33e-d49c6f4fb47c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.477s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 626.260766] env[59659]: DEBUG oslo_concurrency.lockutils [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] Acquiring lock "b0f907a1-c4f4-4d02-9f07-8a640af4cdc4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 626.260766] env[59659]: DEBUG oslo_concurrency.lockutils [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] Lock "b0f907a1-c4f4-4d02-9f07-8a640af4cdc4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 626.274492] env[59659]: DEBUG nova.compute.manager [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] Starting instance... 
{{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 626.357169] env[59659]: DEBUG oslo_concurrency.lockutils [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 626.357456] env[59659]: DEBUG oslo_concurrency.lockutils [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 626.360125] env[59659]: INFO nova.compute.claims [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 626.542104] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-166a55b4-a6b0-487b-9867-19e84805c288 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 626.549943] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-254e3a7b-3c17-453e-841a-2829ce345953 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 626.583617] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd6c1a3d-b5b3-4c94-9bb0-255405872ab1 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 626.592555] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1cc3ba44-38e6-45fd-b176-d686ea5b103c {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 626.611688] env[59659]: DEBUG nova.compute.provider_tree [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 626.623639] env[59659]: DEBUG nova.scheduler.client.report [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 177, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 626.644761] env[59659]: DEBUG oslo_concurrency.lockutils [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 
tempest-ServerDiagnosticsTest-1663640938-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.285s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 626.644761] env[59659]: DEBUG nova.compute.manager [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] Start building networks asynchronously for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 626.681946] env[59659]: DEBUG nova.compute.utils [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] Using /dev/sd instead of None {{(pid=59659) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 626.684679] env[59659]: DEBUG nova.compute.manager [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] Allocating IP information in the background. {{(pid=59659) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 626.684679] env[59659]: DEBUG nova.network.neutron [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] allocate_for_instance() {{(pid=59659) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 626.701680] env[59659]: DEBUG nova.compute.manager [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] Start building block device mappings for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 626.776166] env[59659]: DEBUG nova.compute.manager [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] Start spawning the instance on the hypervisor. 
{{(pid=59659) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 626.806029] env[59659]: DEBUG nova.virt.hardware [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-30T19:41:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-30T19:41:37Z,direct_url=,disk_format='vmdk',id=0fa786c9-f55e-46dc-b725-aa456ca9ff53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='3f4ccdf020b5413c9a4233eccb6b55a8',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-30T19:41:37Z,virtual_size=,visibility=), allow threads: False {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 626.809481] env[59659]: DEBUG nova.virt.hardware [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] Flavor limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 626.809481] env[59659]: DEBUG nova.virt.hardware [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] Image limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 626.809481] env[59659]: DEBUG nova.virt.hardware [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] Flavor pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 626.809481] env[59659]: DEBUG nova.virt.hardware [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] Image pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 626.809481] env[59659]: DEBUG nova.virt.hardware [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 626.809687] env[59659]: DEBUG nova.virt.hardware [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 626.809687] env[59659]: DEBUG nova.virt.hardware [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 626.809687] env[59659]: DEBUG nova.virt.hardware [None 
req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] Got 1 possible topologies {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 626.809687] env[59659]: DEBUG nova.virt.hardware [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 626.809687] env[59659]: DEBUG nova.virt.hardware [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 626.809835] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a2dda86-c2f0-4bfa-a0a2-bc708c265439 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 626.818613] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-06ba05d0-5564-4000-9b27-861cddad3920 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 627.179443] env[59659]: DEBUG nova.policy [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7e6dba0da3574cf7b9102a17bf8d1a43', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6502f31a94194708b7eb572e3c9dd2d6', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59659) authorize /opt/stack/nova/nova/policy.py:203}} [ 627.191183] env[59659]: DEBUG nova.network.neutron [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] Successfully created port: 0e4c20cd-dcc1-4b76-a581-ea92335f3e09 {{(pid=59659) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 628.522379] env[59659]: DEBUG nova.network.neutron [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] Successfully created port: 8ec22c7e-09a3-4c3f-bb3d-32092582ca44 {{(pid=59659) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 629.609779] env[59659]: ERROR nova.compute.manager [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 7fb0cc84-bd6f-4ed9-a2a5-8e37c68b3ed1, please check neutron logs for more information. 
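
The nova.policy records above show the network:attach_external_network check failing for tempest project members whose credentials carry only the 'member' and 'reader' roles; as the log shows, the build simply continues without an external network and the ports are still created afterwards. A toy, stdlib-only version of that kind of admin-only gate (the rule below is an assumption; the real evaluation is done by oslo.policy against Nova's registered defaults):

# Toy policy check mirroring the "Policy check ... failed with credentials"
# lines above. Assumption: network:attach_external_network is admin-only.
RULES = {
    'network:attach_external_network':
        lambda creds: creds.get('is_admin', False),
}


def authorize(action, credentials):
    allowed = RULES[action](credentials)
    if not allowed:
        print(f"Policy check for {action} failed with credentials {credentials}")
    return allowed


creds = {'is_admin': False,
         'roles': ['member', 'reader'],
         'project_id': '6502f31a94194708b7eb572e3c9dd2d6'}
authorize('network:attach_external_network', creds)
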
[ 629.609779] env[59659]: ERROR nova.compute.manager Traceback (most recent call last): [ 629.609779] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 629.609779] env[59659]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 629.609779] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 629.609779] env[59659]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 629.609779] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 629.609779] env[59659]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 629.609779] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 629.609779] env[59659]: ERROR nova.compute.manager self.force_reraise() [ 629.609779] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 629.609779] env[59659]: ERROR nova.compute.manager raise self.value [ 629.609779] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 629.609779] env[59659]: ERROR nova.compute.manager updated_port = self._update_port( [ 629.609779] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 629.609779] env[59659]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 629.610437] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 629.610437] env[59659]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 629.610437] env[59659]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 7fb0cc84-bd6f-4ed9-a2a5-8e37c68b3ed1, please check neutron logs for more information. 
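
Every traceback in this log funnels through oslo_utils.excutils.save_and_reraise_exception before the original PortBindingFailed resurfaces (the __exit__ / force_reraise / raise self.value frames). A simplified, stdlib-only rendering of that pattern (an illustration, not the oslo.utils implementation):

# Capture the in-flight exception, let cleanup code run, then re-raise the
# original -- the force_reraise()/raise self.value steps in the frames above.
import sys


class save_and_reraise_exception:
    def __init__(self):
        self.reraise = True
        self.type_, self.value, self.tb = None, None, None

    def __enter__(self):
        self.type_, self.value, self.tb = sys.exc_info()
        return self

    def force_reraise(self):
        raise self.value.with_traceback(self.tb)

    def __exit__(self, exc_type, exc_val, exc_tb):
        if exc_type is not None:       # the cleanup body itself failed
            return False
        if self.reraise and self.value is not None:
            self.force_reraise()
        return False


class PortBindingFailed(Exception):
    pass


def update_port(port_id):
    raise PortBindingFailed(f"Binding failed for port {port_id}")


try:
    try:
        update_port("7fb0cc84-bd6f-4ed9-a2a5-8e37c68b3ed1")
    except PortBindingFailed:
        with save_and_reraise_exception():
            print("cleanup: reverting partially created ports")
except PortBindingFailed as exc:
    print("re-raised:", exc)
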
[ 629.610437] env[59659]: ERROR nova.compute.manager [ 629.610437] env[59659]: Traceback (most recent call last): [ 629.610437] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 629.610437] env[59659]: listener.cb(fileno) [ 629.610437] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 629.610437] env[59659]: result = function(*args, **kwargs) [ 629.610437] env[59659]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 629.610437] env[59659]: return func(*args, **kwargs) [ 629.610437] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 629.610437] env[59659]: raise e [ 629.610437] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 629.610437] env[59659]: nwinfo = self.network_api.allocate_for_instance( [ 629.610437] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 629.610437] env[59659]: created_port_ids = self._update_ports_for_instance( [ 629.610437] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 629.610437] env[59659]: with excutils.save_and_reraise_exception(): [ 629.610437] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 629.610437] env[59659]: self.force_reraise() [ 629.610437] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 629.610437] env[59659]: raise self.value [ 629.610437] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 629.610437] env[59659]: updated_port = self._update_port( [ 629.610437] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 629.610437] env[59659]: _ensure_no_port_binding_failure(port) [ 629.610437] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 629.610437] env[59659]: raise exception.PortBindingFailed(port_id=port['id']) [ 629.611597] env[59659]: nova.exception.PortBindingFailed: Binding failed for port 7fb0cc84-bd6f-4ed9-a2a5-8e37c68b3ed1, please check neutron logs for more information. [ 629.611597] env[59659]: Removing descriptor: 14 [ 629.611597] env[59659]: ERROR nova.compute.manager [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 7fb0cc84-bd6f-4ed9-a2a5-8e37c68b3ed1, please check neutron logs for more information. 
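
The instance-level traceback that follows (like the earlier ones for 42920efb) shows why the binding failure surfaces inside the VMware driver: allocate_for_instance() runs in a background green thread, and the deferred network_info wrapper only re-raises the error when get_vif_info() first iterates it (the model.py _sync_wrapper / wait frames). A rough stand-in using plain threads instead of eventlet (names below are illustrative, not Nova's):

# Rough stand-in for the deferred network_info pattern in the tracebacks:
# allocation runs in the background, and its exception is re-raised only
# when the result is first consumed.
from concurrent.futures import ThreadPoolExecutor


class PortBindingFailed(Exception):
    pass


class AsyncNetworkInfo:
    def __init__(self, pool, fn, *args):
        self._future = pool.submit(fn, *args)

    def __iter__(self):
        # Future.result() blocks and re-raises the worker's exception,
        # which is the moment "Instance failed to spawn" gets logged.
        return iter(self._future.result())


def allocate_for_instance(port_id):
    raise PortBindingFailed(f"Binding failed for port {port_id}")


with ThreadPoolExecutor(max_workers=1) as pool:
    network_info = AsyncNetworkInfo(
        pool, allocate_for_instance, "7fb0cc84-bd6f-4ed9-a2a5-8e37c68b3ed1")
    try:
        for vif in network_info:          # mirrors get_vif_info()'s loop
            print(vif)
    except PortBindingFailed as exc:
        print("spawn failed:", exc)
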
[ 629.611597] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] Traceback (most recent call last): [ 629.611597] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 629.611597] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] yield resources [ 629.611597] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 629.611597] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] self.driver.spawn(context, instance, image_meta, [ 629.611597] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 629.611597] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 629.611597] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 629.611597] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] vm_ref = self.build_virtual_machine(instance, [ 629.611904] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 629.611904] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] vif_infos = vmwarevif.get_vif_info(self._session, [ 629.611904] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 629.611904] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] for vif in network_info: [ 629.611904] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 629.611904] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] return self._sync_wrapper(fn, *args, **kwargs) [ 629.611904] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 629.611904] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] self.wait() [ 629.611904] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 629.611904] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] self[:] = self._gt.wait() [ 629.611904] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 629.611904] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] return self._exit_event.wait() [ 629.611904] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 629.611904] env[59659]: ERROR nova.compute.manager 
[instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] result = hub.switch() [ 629.612244] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 629.612244] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] return self.greenlet.switch() [ 629.612244] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 629.612244] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] result = function(*args, **kwargs) [ 629.612244] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 629.612244] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] return func(*args, **kwargs) [ 629.612244] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 629.612244] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] raise e [ 629.612244] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 629.612244] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] nwinfo = self.network_api.allocate_for_instance( [ 629.612244] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 629.612244] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] created_port_ids = self._update_ports_for_instance( [ 629.612244] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 629.612583] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] with excutils.save_and_reraise_exception(): [ 629.612583] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 629.612583] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] self.force_reraise() [ 629.612583] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 629.612583] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] raise self.value [ 629.612583] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 629.612583] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] updated_port = self._update_port( [ 629.612583] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 629.612583] env[59659]: ERROR nova.compute.manager [instance: 
75398340-5ec7-4e3f-abc1-602f838d7ef3] _ensure_no_port_binding_failure(port) [ 629.612583] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 629.612583] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] raise exception.PortBindingFailed(port_id=port['id']) [ 629.612583] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] nova.exception.PortBindingFailed: Binding failed for port 7fb0cc84-bd6f-4ed9-a2a5-8e37c68b3ed1, please check neutron logs for more information. [ 629.612583] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] [ 629.612923] env[59659]: INFO nova.compute.manager [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] Terminating instance [ 629.614027] env[59659]: DEBUG oslo_concurrency.lockutils [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] Acquiring lock "refresh_cache-75398340-5ec7-4e3f-abc1-602f838d7ef3" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 629.614027] env[59659]: DEBUG oslo_concurrency.lockutils [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] Acquired lock "refresh_cache-75398340-5ec7-4e3f-abc1-602f838d7ef3" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 629.614027] env[59659]: DEBUG nova.network.neutron [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 629.689135] env[59659]: DEBUG nova.network.neutron [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 629.925265] env[59659]: DEBUG nova.compute.manager [req-0210e9bf-6bf4-4ad2-927b-837e19c445f0 req-1dd4445a-76c9-4e9f-a1a3-3caae4b28c94 service nova] [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] Received event network-changed-7fb0cc84-bd6f-4ed9-a2a5-8e37c68b3ed1 {{(pid=59659) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 629.925850] env[59659]: DEBUG nova.compute.manager [req-0210e9bf-6bf4-4ad2-927b-837e19c445f0 req-1dd4445a-76c9-4e9f-a1a3-3caae4b28c94 service nova] [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] Refreshing instance network info cache due to event network-changed-7fb0cc84-bd6f-4ed9-a2a5-8e37c68b3ed1. 
{{(pid=59659) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 629.925850] env[59659]: DEBUG oslo_concurrency.lockutils [req-0210e9bf-6bf4-4ad2-927b-837e19c445f0 req-1dd4445a-76c9-4e9f-a1a3-3caae4b28c94 service nova] Acquiring lock "refresh_cache-75398340-5ec7-4e3f-abc1-602f838d7ef3" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 630.017264] env[59659]: DEBUG nova.network.neutron [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 630.029717] env[59659]: DEBUG oslo_concurrency.lockutils [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] Releasing lock "refresh_cache-75398340-5ec7-4e3f-abc1-602f838d7ef3" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 630.031248] env[59659]: DEBUG nova.compute.manager [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] Start destroying the instance on the hypervisor. {{(pid=59659) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 630.031345] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] Destroying instance {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 630.032458] env[59659]: DEBUG oslo_concurrency.lockutils [req-0210e9bf-6bf4-4ad2-927b-837e19c445f0 req-1dd4445a-76c9-4e9f-a1a3-3caae4b28c94 service nova] Acquired lock "refresh_cache-75398340-5ec7-4e3f-abc1-602f838d7ef3" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 630.032664] env[59659]: DEBUG nova.network.neutron [req-0210e9bf-6bf4-4ad2-927b-837e19c445f0 req-1dd4445a-76c9-4e9f-a1a3-3caae4b28c94 service nova] [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] Refreshing network info cache for port 7fb0cc84-bd6f-4ed9-a2a5-8e37c68b3ed1 {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 630.033898] env[59659]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-6df595d3-2b98-4ba8-bdc7-a5692a70dafb {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 630.046828] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c30b7ff4-9ed2-4f4d-83f2-7bca2c82573c {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 630.077027] env[59659]: WARNING nova.virt.vmwareapi.vmops [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 75398340-5ec7-4e3f-abc1-602f838d7ef3 could not be found. 
[ 630.077312] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] Instance destroyed {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 630.077589] env[59659]: INFO nova.compute.manager [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] Took 0.05 seconds to destroy the instance on the hypervisor. [ 630.077876] env[59659]: DEBUG oslo.service.loopingcall [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=59659) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 630.078141] env[59659]: DEBUG nova.compute.manager [-] [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 630.078286] env[59659]: DEBUG nova.network.neutron [-] [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 630.114468] env[59659]: DEBUG nova.network.neutron [req-0210e9bf-6bf4-4ad2-927b-837e19c445f0 req-1dd4445a-76c9-4e9f-a1a3-3caae4b28c94 service nova] [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 630.123032] env[59659]: DEBUG nova.network.neutron [-] [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 630.135064] env[59659]: DEBUG nova.network.neutron [-] [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 630.145369] env[59659]: INFO nova.compute.manager [-] [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] Took 0.07 seconds to deallocate network for instance. 
[ 630.147199] env[59659]: DEBUG nova.compute.claims [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] Aborting claim: {{(pid=59659) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 630.147652] env[59659]: DEBUG oslo_concurrency.lockutils [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 630.147988] env[59659]: DEBUG oslo_concurrency.lockutils [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 630.189401] env[59659]: ERROR nova.compute.manager [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port e7aa5e57-f725-4a6c-a6f6-93fb3aad32ea, please check neutron logs for more information. [ 630.189401] env[59659]: ERROR nova.compute.manager Traceback (most recent call last): [ 630.189401] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 630.189401] env[59659]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 630.189401] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 630.189401] env[59659]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 630.189401] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 630.189401] env[59659]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 630.189401] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 630.189401] env[59659]: ERROR nova.compute.manager self.force_reraise() [ 630.189401] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 630.189401] env[59659]: ERROR nova.compute.manager raise self.value [ 630.189401] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 630.189401] env[59659]: ERROR nova.compute.manager updated_port = self._update_port( [ 630.189401] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 630.189401] env[59659]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 630.189873] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 630.189873] env[59659]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 630.189873] env[59659]: ERROR 
nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port e7aa5e57-f725-4a6c-a6f6-93fb3aad32ea, please check neutron logs for more information. [ 630.189873] env[59659]: ERROR nova.compute.manager [ 630.190652] env[59659]: Traceback (most recent call last): [ 630.190652] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 630.190652] env[59659]: listener.cb(fileno) [ 630.190652] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 630.190652] env[59659]: result = function(*args, **kwargs) [ 630.190652] env[59659]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 630.190652] env[59659]: return func(*args, **kwargs) [ 630.190652] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 630.190652] env[59659]: raise e [ 630.190652] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 630.190652] env[59659]: nwinfo = self.network_api.allocate_for_instance( [ 630.190652] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 630.190652] env[59659]: created_port_ids = self._update_ports_for_instance( [ 630.190652] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 630.190652] env[59659]: with excutils.save_and_reraise_exception(): [ 630.190652] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 630.190652] env[59659]: self.force_reraise() [ 630.190652] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 630.190652] env[59659]: raise self.value [ 630.190652] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 630.190652] env[59659]: updated_port = self._update_port( [ 630.190652] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 630.190652] env[59659]: _ensure_no_port_binding_failure(port) [ 630.190652] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 630.190652] env[59659]: raise exception.PortBindingFailed(port_id=port['id']) [ 630.190652] env[59659]: nova.exception.PortBindingFailed: Binding failed for port e7aa5e57-f725-4a6c-a6f6-93fb3aad32ea, please check neutron logs for more information. [ 630.190652] env[59659]: Removing descriptor: 15 [ 630.193861] env[59659]: ERROR nova.compute.manager [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port e7aa5e57-f725-4a6c-a6f6-93fb3aad32ea, please check neutron logs for more information. 
[ 630.193861] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] Traceback (most recent call last): [ 630.193861] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 630.193861] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] yield resources [ 630.193861] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 630.193861] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] self.driver.spawn(context, instance, image_meta, [ 630.193861] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 630.193861] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] self._vmops.spawn(context, instance, image_meta, injected_files, [ 630.193861] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 630.193861] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] vm_ref = self.build_virtual_machine(instance, [ 630.193861] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 630.194292] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] vif_infos = vmwarevif.get_vif_info(self._session, [ 630.194292] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 630.194292] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] for vif in network_info: [ 630.194292] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 630.194292] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] return self._sync_wrapper(fn, *args, **kwargs) [ 630.194292] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 630.194292] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] self.wait() [ 630.194292] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 630.194292] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] self[:] = self._gt.wait() [ 630.194292] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 630.194292] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] return self._exit_event.wait() [ 630.194292] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 630.194292] env[59659]: ERROR nova.compute.manager 
[instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] result = hub.switch() [ 630.194292] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 630.194633] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] return self.greenlet.switch() [ 630.194633] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 630.194633] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] result = function(*args, **kwargs) [ 630.194633] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 630.194633] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] return func(*args, **kwargs) [ 630.194633] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 630.194633] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] raise e [ 630.194633] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 630.194633] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] nwinfo = self.network_api.allocate_for_instance( [ 630.194633] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 630.194633] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] created_port_ids = self._update_ports_for_instance( [ 630.194633] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 630.194633] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] with excutils.save_and_reraise_exception(): [ 630.194937] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 630.194937] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] self.force_reraise() [ 630.194937] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 630.194937] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] raise self.value [ 630.194937] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 630.194937] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] updated_port = self._update_port( [ 630.194937] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 630.194937] env[59659]: ERROR nova.compute.manager [instance: 
5e7637fe-8828-4c16-a629-0d82f1efded9] _ensure_no_port_binding_failure(port) [ 630.194937] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 630.194937] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] raise exception.PortBindingFailed(port_id=port['id']) [ 630.194937] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] nova.exception.PortBindingFailed: Binding failed for port e7aa5e57-f725-4a6c-a6f6-93fb3aad32ea, please check neutron logs for more information. [ 630.194937] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] [ 630.195240] env[59659]: INFO nova.compute.manager [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] Terminating instance [ 630.201278] env[59659]: DEBUG oslo_concurrency.lockutils [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] Acquiring lock "refresh_cache-5e7637fe-8828-4c16-a629-0d82f1efded9" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 630.202067] env[59659]: DEBUG oslo_concurrency.lockutils [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] Acquired lock "refresh_cache-5e7637fe-8828-4c16-a629-0d82f1efded9" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 630.202323] env[59659]: DEBUG nova.network.neutron [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 630.238070] env[59659]: DEBUG oslo_service.periodic_task [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Running periodic task ComputeManager._sync_power_states {{(pid=59659) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 630.256721] env[59659]: DEBUG nova.network.neutron [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] Instance cache missing network info. 
{{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 630.262845] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Getting list of instances from cluster (obj){ [ 630.262845] env[59659]: value = "domain-c8" [ 630.262845] env[59659]: _type = "ClusterComputeResource" [ 630.262845] env[59659]: } {{(pid=59659) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 630.266110] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d261a5db-f81a-415f-b181-55fdaeed5ca9 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 630.280144] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Got total of 0 instances {{(pid=59659) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 630.280144] env[59659]: WARNING nova.compute.manager [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] While synchronizing instance power states, found 3 instances in the database and 0 instances on the hypervisor. [ 630.280144] env[59659]: DEBUG nova.compute.manager [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Triggering sync for uuid 5e7637fe-8828-4c16-a629-0d82f1efded9 {{(pid=59659) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10224}} [ 630.280144] env[59659]: DEBUG nova.compute.manager [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Triggering sync for uuid 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a {{(pid=59659) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10224}} [ 630.280144] env[59659]: DEBUG nova.compute.manager [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Triggering sync for uuid b0f907a1-c4f4-4d02-9f07-8a640af4cdc4 {{(pid=59659) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10224}} [ 630.280741] env[59659]: DEBUG oslo_concurrency.lockutils [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Acquiring lock "5e7637fe-8828-4c16-a629-0d82f1efded9" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 630.281334] env[59659]: DEBUG oslo_concurrency.lockutils [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Acquiring lock "4cdfac2f-6963-4f71-9a42-709f2eeb4f9a" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 630.284016] env[59659]: DEBUG oslo_concurrency.lockutils [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Acquiring lock "b0f907a1-c4f4-4d02-9f07-8a640af4cdc4" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 630.284016] env[59659]: DEBUG oslo_service.periodic_task [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=59659) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 630.284016] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Getting list of instances from cluster (obj){ [ 630.284016] env[59659]: value = "domain-c8" [ 
630.284016] env[59659]: _type = "ClusterComputeResource" [ 630.284016] env[59659]: } {{(pid=59659) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 630.284016] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-959e65b4-a2f3-4cbe-b6fa-c19e64a114c9 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 630.297401] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Got total of 0 instances {{(pid=59659) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 630.309899] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-17eba638-ed68-46d1-a1a6-22f349054a8a {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 630.319910] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-74f148c8-fee8-4c6f-a692-43c52841a68d {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 630.354363] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-abcaff96-fa9d-4651-8cdc-51b39ee1228d {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 630.366287] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3abd92d-e03e-48c3-9abf-4463c8345165 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 630.383709] env[59659]: DEBUG nova.compute.provider_tree [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 630.392107] env[59659]: DEBUG nova.scheduler.client.report [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 177, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 630.406827] env[59659]: DEBUG oslo_concurrency.lockutils [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.259s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 630.407445] env[59659]: ERROR nova.compute.manager [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] Failed to build and run instance: nova.exception.PortBindingFailed: 
Binding failed for port 7fb0cc84-bd6f-4ed9-a2a5-8e37c68b3ed1, please check neutron logs for more information. [ 630.407445] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] Traceback (most recent call last): [ 630.407445] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 630.407445] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] self.driver.spawn(context, instance, image_meta, [ 630.407445] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 630.407445] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 630.407445] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 630.407445] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] vm_ref = self.build_virtual_machine(instance, [ 630.407445] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 630.407445] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] vif_infos = vmwarevif.get_vif_info(self._session, [ 630.407445] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 630.407751] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] for vif in network_info: [ 630.407751] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 630.407751] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] return self._sync_wrapper(fn, *args, **kwargs) [ 630.407751] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 630.407751] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] self.wait() [ 630.407751] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 630.407751] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] self[:] = self._gt.wait() [ 630.407751] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 630.407751] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] return self._exit_event.wait() [ 630.407751] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 630.407751] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] result = hub.switch() [ 630.407751] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] File 
"/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 630.407751] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] return self.greenlet.switch() [ 630.407751] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 630.408090] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] result = function(*args, **kwargs) [ 630.408090] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 630.408090] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] return func(*args, **kwargs) [ 630.408090] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 630.408090] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] raise e [ 630.408090] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 630.408090] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] nwinfo = self.network_api.allocate_for_instance( [ 630.408090] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 630.408090] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] created_port_ids = self._update_ports_for_instance( [ 630.408090] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 630.408090] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] with excutils.save_and_reraise_exception(): [ 630.408090] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 630.408090] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] self.force_reraise() [ 630.408386] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 630.408386] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] raise self.value [ 630.408386] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 630.408386] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] updated_port = self._update_port( [ 630.408386] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 630.408386] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] _ensure_no_port_binding_failure(port) [ 630.408386] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] File 
"/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 630.408386] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] raise exception.PortBindingFailed(port_id=port['id']) [ 630.408386] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] nova.exception.PortBindingFailed: Binding failed for port 7fb0cc84-bd6f-4ed9-a2a5-8e37c68b3ed1, please check neutron logs for more information. [ 630.408386] env[59659]: ERROR nova.compute.manager [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] [ 630.408386] env[59659]: DEBUG nova.compute.utils [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] Binding failed for port 7fb0cc84-bd6f-4ed9-a2a5-8e37c68b3ed1, please check neutron logs for more information. {{(pid=59659) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 630.409637] env[59659]: DEBUG nova.compute.manager [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] Build of instance 75398340-5ec7-4e3f-abc1-602f838d7ef3 was re-scheduled: Binding failed for port 7fb0cc84-bd6f-4ed9-a2a5-8e37c68b3ed1, please check neutron logs for more information. {{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 630.410017] env[59659]: DEBUG nova.compute.manager [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] Unplugging VIFs for instance {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 630.410214] env[59659]: DEBUG oslo_concurrency.lockutils [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] Acquiring lock "refresh_cache-75398340-5ec7-4e3f-abc1-602f838d7ef3" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 630.524912] env[59659]: DEBUG nova.network.neutron [req-0210e9bf-6bf4-4ad2-927b-837e19c445f0 req-1dd4445a-76c9-4e9f-a1a3-3caae4b28c94 service nova] [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 630.535396] env[59659]: DEBUG oslo_concurrency.lockutils [req-0210e9bf-6bf4-4ad2-927b-837e19c445f0 req-1dd4445a-76c9-4e9f-a1a3-3caae4b28c94 service nova] Releasing lock "refresh_cache-75398340-5ec7-4e3f-abc1-602f838d7ef3" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 630.536832] env[59659]: DEBUG oslo_concurrency.lockutils [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] Acquired lock "refresh_cache-75398340-5ec7-4e3f-abc1-602f838d7ef3" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 630.536832] env[59659]: DEBUG nova.network.neutron [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] Building network 
info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 630.618582] env[59659]: DEBUG nova.network.neutron [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 630.715378] env[59659]: DEBUG nova.network.neutron [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 630.728106] env[59659]: DEBUG oslo_concurrency.lockutils [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] Releasing lock "refresh_cache-5e7637fe-8828-4c16-a629-0d82f1efded9" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 630.728553] env[59659]: DEBUG nova.compute.manager [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] Start destroying the instance on the hypervisor. {{(pid=59659) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 630.728766] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] Destroying instance {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 630.729458] env[59659]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-1dcfd5b0-ed94-46be-bdd2-475a07dc6e23 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 630.741478] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-25e13abc-a080-468d-b064-e2db830fe5a9 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 630.764951] env[59659]: WARNING nova.virt.vmwareapi.vmops [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 5e7637fe-8828-4c16-a629-0d82f1efded9 could not be found. 
[ 630.765203] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] Instance destroyed {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 630.765382] env[59659]: INFO nova.compute.manager [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] Took 0.04 seconds to destroy the instance on the hypervisor. [ 630.765620] env[59659]: DEBUG oslo.service.loopingcall [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=59659) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 630.765825] env[59659]: DEBUG nova.compute.manager [-] [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 630.765919] env[59659]: DEBUG nova.network.neutron [-] [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 630.831575] env[59659]: DEBUG nova.network.neutron [-] [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 630.841548] env[59659]: DEBUG nova.network.neutron [-] [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 630.852983] env[59659]: INFO nova.compute.manager [-] [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] Took 0.09 seconds to deallocate network for instance. 
[ 630.856082] env[59659]: DEBUG nova.compute.claims [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] Aborting claim: {{(pid=59659) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 630.856359] env[59659]: DEBUG oslo_concurrency.lockutils [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 630.856528] env[59659]: DEBUG oslo_concurrency.lockutils [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 630.892594] env[59659]: DEBUG nova.network.neutron [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 630.909165] env[59659]: DEBUG oslo_concurrency.lockutils [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] Releasing lock "refresh_cache-75398340-5ec7-4e3f-abc1-602f838d7ef3" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 630.909165] env[59659]: DEBUG nova.compute.manager [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 630.909165] env[59659]: DEBUG nova.compute.manager [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 630.909165] env[59659]: DEBUG nova.network.neutron [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 630.956484] env[59659]: DEBUG nova.network.neutron [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] Instance cache missing network info. 
{{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 630.975997] env[59659]: DEBUG nova.network.neutron [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 630.988780] env[59659]: INFO nova.compute.manager [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] [instance: 75398340-5ec7-4e3f-abc1-602f838d7ef3] Took 0.08 seconds to deallocate network for instance. [ 630.994939] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-894a6f9f-2e67-46ad-ad04-809513d860fa {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 631.004941] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b02880ae-1c49-4291-9c7a-5f001439c662 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 631.044093] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dbe97226-fcdf-4acc-9673-d86b6532d6ad {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 631.054654] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-747ecc51-1a11-41b3-b6e3-d66182da1ba5 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 631.067859] env[59659]: DEBUG nova.compute.provider_tree [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 631.084403] env[59659]: DEBUG nova.scheduler.client.report [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 177, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 631.101628] env[59659]: DEBUG oslo_concurrency.lockutils [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.245s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 631.102377] env[59659]: ERROR nova.compute.manager [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 
tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port e7aa5e57-f725-4a6c-a6f6-93fb3aad32ea, please check neutron logs for more information. [ 631.102377] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] Traceback (most recent call last): [ 631.102377] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 631.102377] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] self.driver.spawn(context, instance, image_meta, [ 631.102377] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 631.102377] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] self._vmops.spawn(context, instance, image_meta, injected_files, [ 631.102377] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 631.102377] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] vm_ref = self.build_virtual_machine(instance, [ 631.102377] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 631.102377] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] vif_infos = vmwarevif.get_vif_info(self._session, [ 631.102377] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 631.102797] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] for vif in network_info: [ 631.102797] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 631.102797] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] return self._sync_wrapper(fn, *args, **kwargs) [ 631.102797] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 631.102797] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] self.wait() [ 631.102797] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 631.102797] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] self[:] = self._gt.wait() [ 631.102797] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 631.102797] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] return self._exit_event.wait() [ 631.102797] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 
631.102797] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] result = hub.switch() [ 631.102797] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 631.102797] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] return self.greenlet.switch() [ 631.102797] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 631.103371] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] result = function(*args, **kwargs) [ 631.103371] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 631.103371] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] return func(*args, **kwargs) [ 631.103371] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 631.103371] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] raise e [ 631.103371] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 631.103371] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] nwinfo = self.network_api.allocate_for_instance( [ 631.103371] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 631.103371] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] created_port_ids = self._update_ports_for_instance( [ 631.103371] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 631.103371] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] with excutils.save_and_reraise_exception(): [ 631.103371] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 631.103371] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] self.force_reraise() [ 631.104055] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 631.104055] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] raise self.value [ 631.104055] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 631.104055] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] updated_port = self._update_port( [ 631.104055] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 631.104055] env[59659]: 
ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] _ensure_no_port_binding_failure(port) [ 631.104055] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 631.104055] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] raise exception.PortBindingFailed(port_id=port['id']) [ 631.104055] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] nova.exception.PortBindingFailed: Binding failed for port e7aa5e57-f725-4a6c-a6f6-93fb3aad32ea, please check neutron logs for more information. [ 631.104055] env[59659]: ERROR nova.compute.manager [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] [ 631.104410] env[59659]: DEBUG nova.compute.utils [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] Binding failed for port e7aa5e57-f725-4a6c-a6f6-93fb3aad32ea, please check neutron logs for more information. {{(pid=59659) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 631.104896] env[59659]: DEBUG nova.compute.manager [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] Build of instance 5e7637fe-8828-4c16-a629-0d82f1efded9 was re-scheduled: Binding failed for port e7aa5e57-f725-4a6c-a6f6-93fb3aad32ea, please check neutron logs for more information. {{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 631.105327] env[59659]: DEBUG nova.compute.manager [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] Unplugging VIFs for instance {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 631.105545] env[59659]: DEBUG oslo_concurrency.lockutils [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] Acquiring lock "refresh_cache-5e7637fe-8828-4c16-a629-0d82f1efded9" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 631.105686] env[59659]: DEBUG oslo_concurrency.lockutils [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] Acquired lock "refresh_cache-5e7637fe-8828-4c16-a629-0d82f1efded9" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 631.105841] env[59659]: DEBUG nova.network.neutron [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 631.117105] env[59659]: INFO nova.scheduler.client.report [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] Deleted 
allocations for instance 75398340-5ec7-4e3f-abc1-602f838d7ef3 [ 631.153487] env[59659]: DEBUG oslo_concurrency.lockutils [None req-ad1b777c-d99f-48ed-9fd1-ccc8fb9bba27 tempest-MigrationsAdminTest-1048639197 tempest-MigrationsAdminTest-1048639197-project-member] Lock "75398340-5ec7-4e3f-abc1-602f838d7ef3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 11.065s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 631.190031] env[59659]: DEBUG nova.network.neutron [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 631.331352] env[59659]: DEBUG oslo_concurrency.lockutils [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] Acquiring lock "4a21b251-816d-4668-9a2e-eeabd9ed347b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 631.331352] env[59659]: DEBUG oslo_concurrency.lockutils [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] Lock "4a21b251-816d-4668-9a2e-eeabd9ed347b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 631.349114] env[59659]: DEBUG nova.compute.manager [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] Starting instance... 
{{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 631.416018] env[59659]: DEBUG oslo_concurrency.lockutils [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 631.416018] env[59659]: DEBUG oslo_concurrency.lockutils [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 631.416018] env[59659]: INFO nova.compute.claims [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 631.523295] env[59659]: DEBUG nova.network.neutron [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 631.537764] env[59659]: DEBUG oslo_concurrency.lockutils [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] Releasing lock "refresh_cache-5e7637fe-8828-4c16-a629-0d82f1efded9" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 631.537764] env[59659]: DEBUG nova.compute.manager [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 631.537764] env[59659]: DEBUG nova.compute.manager [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 631.537764] env[59659]: DEBUG nova.network.neutron [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 631.578350] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a3d44fc-0e9b-4034-b6a3-212a6012b11f {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 631.594113] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f95f353d-cfa0-4214-a302-962ba44ac76d {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 631.628565] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2efa8aaa-57e2-4dac-94b9-0476a3ea8528 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 631.637956] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-53d0047d-4a69-4531-8491-97f05902dd5d {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 631.652051] env[59659]: DEBUG nova.compute.provider_tree [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 631.669082] env[59659]: DEBUG nova.scheduler.client.report [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 177, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 631.682766] env[59659]: DEBUG oslo_concurrency.lockutils [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.268s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 631.683280] env[59659]: DEBUG nova.compute.manager [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a 
tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] Start building networks asynchronously for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 631.729136] env[59659]: DEBUG nova.compute.utils [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] Using /dev/sd instead of None {{(pid=59659) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 631.730888] env[59659]: DEBUG nova.compute.manager [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] Allocating IP information in the background. {{(pid=59659) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 631.730985] env[59659]: DEBUG nova.network.neutron [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] allocate_for_instance() {{(pid=59659) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 631.749217] env[59659]: DEBUG nova.compute.manager [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] Start building block device mappings for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 631.821786] env[59659]: DEBUG nova.compute.manager [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] Start spawning the instance on the hypervisor. {{(pid=59659) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 631.830769] env[59659]: DEBUG nova.network.neutron [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] Instance cache missing network info. 
{{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 631.843797] env[59659]: DEBUG nova.network.neutron [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 631.851280] env[59659]: DEBUG nova.virt.hardware [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-30T19:41:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-30T19:41:37Z,direct_url=,disk_format='vmdk',id=0fa786c9-f55e-46dc-b725-aa456ca9ff53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='3f4ccdf020b5413c9a4233eccb6b55a8',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-30T19:41:37Z,virtual_size=,visibility=), allow threads: False {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 631.851506] env[59659]: DEBUG nova.virt.hardware [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] Flavor limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 631.851648] env[59659]: DEBUG nova.virt.hardware [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] Image limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 631.851828] env[59659]: DEBUG nova.virt.hardware [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] Flavor pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 631.851961] env[59659]: DEBUG nova.virt.hardware [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] Image pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 631.852105] env[59659]: DEBUG nova.virt.hardware [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 631.852300] env[59659]: DEBUG nova.virt.hardware [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 
631.852441] env[59659]: DEBUG nova.virt.hardware [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 631.852688] env[59659]: DEBUG nova.virt.hardware [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] Got 1 possible topologies {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 631.852754] env[59659]: DEBUG nova.virt.hardware [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 631.852931] env[59659]: DEBUG nova.virt.hardware [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 631.858652] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f87ec56d-c65c-4b1a-a562-e45e0f39bc40 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 631.864117] env[59659]: INFO nova.compute.manager [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] Took 0.33 seconds to deallocate network for instance. [ 631.875860] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e8e49bbc-bf19-44d0-be5d-ffd44e5d39c8 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 631.987479] env[59659]: INFO nova.scheduler.client.report [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] Deleted allocations for instance 5e7637fe-8828-4c16-a629-0d82f1efded9 [ 632.009845] env[59659]: DEBUG oslo_concurrency.lockutils [None req-2f705d30-d0d0-4cac-8f43-0110043792d2 tempest-ServerDiagnosticsNegativeTest-1416714911 tempest-ServerDiagnosticsNegativeTest-1416714911-project-member] Lock "5e7637fe-8828-4c16-a629-0d82f1efded9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 10.979s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 632.010358] env[59659]: DEBUG oslo_concurrency.lockutils [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Lock "5e7637fe-8828-4c16-a629-0d82f1efded9" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 1.729s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 632.010358] env[59659]: INFO nova.compute.manager [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] [instance: 5e7637fe-8828-4c16-a629-0d82f1efded9] During sync_power_state the instance has a pending task (spawning). Skip. 
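The nova.virt.hardware lines just above reduce the 1-vCPU m1.nano flavor to the single VirtCPUTopology(cores=1,sockets=1,threads=1) under the 65536/65536/65536 maxima. A toy, self-contained enumeration of "possible topologies" (an illustration of the idea only, not the nova.virt.hardware implementation):

    # Illustrative only: list (sockets, cores, threads) whose product equals
    # the vCPU count, capped by the per-dimension maxima seen in the log.
    from collections import namedtuple

    Topology = namedtuple('Topology', 'sockets cores threads')

    def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                            max_threads=65536):
        found = []
        for sockets in range(1, min(vcpus, max_sockets) + 1):
            for cores in range(1, min(vcpus, max_cores) + 1):
                for threads in range(1, min(vcpus, max_threads) + 1):
                    if sockets * cores * threads == vcpus:
                        found.append(Topology(sockets, cores, threads))
        return found

    print(possible_topologies(1))
    # [Topology(sockets=1, cores=1, threads=1)] -- matching the log above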
[ 632.010505] env[59659]: DEBUG oslo_concurrency.lockutils [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Lock "5e7637fe-8828-4c16-a629-0d82f1efded9" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 632.017385] env[59659]: DEBUG nova.policy [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e12c72b338434c81bc50996b0638cee2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c58ea55e246943a083fa2eb0e98cb0c1', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59659) authorize /opt/stack/nova/nova/policy.py:203}} [ 632.240525] env[59659]: ERROR nova.compute.manager [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 0e4c20cd-dcc1-4b76-a581-ea92335f3e09, please check neutron logs for more information. [ 632.240525] env[59659]: ERROR nova.compute.manager Traceback (most recent call last): [ 632.240525] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 632.240525] env[59659]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 632.240525] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 632.240525] env[59659]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 632.240525] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 632.240525] env[59659]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 632.240525] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 632.240525] env[59659]: ERROR nova.compute.manager self.force_reraise() [ 632.240525] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 632.240525] env[59659]: ERROR nova.compute.manager raise self.value [ 632.240525] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 632.240525] env[59659]: ERROR nova.compute.manager updated_port = self._update_port( [ 632.240525] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 632.240525] env[59659]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 632.241517] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 632.241517] env[59659]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 632.241517] env[59659]: ERROR nova.compute.manager 
nova.exception.PortBindingFailed: Binding failed for port 0e4c20cd-dcc1-4b76-a581-ea92335f3e09, please check neutron logs for more information. [ 632.241517] env[59659]: ERROR nova.compute.manager [ 632.241517] env[59659]: Traceback (most recent call last): [ 632.241517] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 632.241517] env[59659]: listener.cb(fileno) [ 632.241517] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 632.241517] env[59659]: result = function(*args, **kwargs) [ 632.241517] env[59659]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 632.241517] env[59659]: return func(*args, **kwargs) [ 632.241517] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 632.241517] env[59659]: raise e [ 632.241517] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 632.241517] env[59659]: nwinfo = self.network_api.allocate_for_instance( [ 632.241517] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 632.241517] env[59659]: created_port_ids = self._update_ports_for_instance( [ 632.241517] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 632.241517] env[59659]: with excutils.save_and_reraise_exception(): [ 632.241517] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 632.241517] env[59659]: self.force_reraise() [ 632.241517] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 632.241517] env[59659]: raise self.value [ 632.241517] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 632.241517] env[59659]: updated_port = self._update_port( [ 632.241517] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 632.241517] env[59659]: _ensure_no_port_binding_failure(port) [ 632.241517] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 632.241517] env[59659]: raise exception.PortBindingFailed(port_id=port['id']) [ 632.242576] env[59659]: nova.exception.PortBindingFailed: Binding failed for port 0e4c20cd-dcc1-4b76-a581-ea92335f3e09, please check neutron logs for more information. [ 632.242576] env[59659]: Removing descriptor: 16 [ 632.242576] env[59659]: ERROR nova.compute.manager [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 0e4c20cd-dcc1-4b76-a581-ea92335f3e09, please check neutron logs for more information. 
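Every PortBindingFailed traceback in this log bottoms out in _ensure_no_port_binding_failure (nova/network/neutron.py:294), which rejects a port whose binding Neutron reported as failed. A minimal stand-alone sketch of that check, assuming the conventional 'binding_failed' value of binding:vif_type (the exception class below is a stand-in for nova.exception.PortBindingFailed, not the real one):

    # Hedged sketch: refuse to use a Neutron port whose binding failed.
    class PortBindingFailed(Exception):
        def __init__(self, port_id):
            super().__init__('Binding failed for port %s, please check '
                             'neutron logs for more information.' % port_id)

    def ensure_no_port_binding_failure(port):
        # Neutron marks a port it could not bind with binding:vif_type
        # 'binding_failed'; anything else is treated as usable here.
        if port.get('binding:vif_type') == 'binding_failed':
            raise PortBindingFailed(port_id=port['id'])

    try:
        ensure_no_port_binding_failure(
            {'id': '0e4c20cd-dcc1-4b76-a581-ea92335f3e09',
             'binding:vif_type': 'binding_failed'})
    except PortBindingFailed as exc:
        print(exc)  # same message format as the ERROR lines above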
[ 632.242576] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] Traceback (most recent call last): [ 632.242576] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 632.242576] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] yield resources [ 632.242576] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 632.242576] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] self.driver.spawn(context, instance, image_meta, [ 632.242576] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 632.242576] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 632.242576] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 632.242576] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] vm_ref = self.build_virtual_machine(instance, [ 632.242937] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 632.242937] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] vif_infos = vmwarevif.get_vif_info(self._session, [ 632.242937] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 632.242937] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] for vif in network_info: [ 632.242937] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 632.242937] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] return self._sync_wrapper(fn, *args, **kwargs) [ 632.242937] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 632.242937] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] self.wait() [ 632.242937] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 632.242937] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] self[:] = self._gt.wait() [ 632.242937] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 632.242937] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] return self._exit_event.wait() [ 632.242937] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 632.242937] env[59659]: ERROR nova.compute.manager 
[instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] result = hub.switch() [ 632.243463] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 632.243463] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] return self.greenlet.switch() [ 632.243463] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 632.243463] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] result = function(*args, **kwargs) [ 632.243463] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 632.243463] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] return func(*args, **kwargs) [ 632.243463] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 632.243463] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] raise e [ 632.243463] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 632.243463] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] nwinfo = self.network_api.allocate_for_instance( [ 632.243463] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 632.243463] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] created_port_ids = self._update_ports_for_instance( [ 632.243463] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 632.243855] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] with excutils.save_and_reraise_exception(): [ 632.243855] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 632.243855] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] self.force_reraise() [ 632.243855] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 632.243855] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] raise self.value [ 632.243855] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 632.243855] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] updated_port = self._update_port( [ 632.243855] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 632.243855] env[59659]: ERROR nova.compute.manager [instance: 
4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] _ensure_no_port_binding_failure(port) [ 632.243855] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 632.243855] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] raise exception.PortBindingFailed(port_id=port['id']) [ 632.243855] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] nova.exception.PortBindingFailed: Binding failed for port 0e4c20cd-dcc1-4b76-a581-ea92335f3e09, please check neutron logs for more information. [ 632.243855] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] [ 632.244363] env[59659]: INFO nova.compute.manager [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] Terminating instance [ 632.244363] env[59659]: DEBUG oslo_concurrency.lockutils [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Acquiring lock "refresh_cache-4cdfac2f-6963-4f71-9a42-709f2eeb4f9a" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 632.244363] env[59659]: DEBUG oslo_concurrency.lockutils [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Acquired lock "refresh_cache-4cdfac2f-6963-4f71-9a42-709f2eeb4f9a" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 632.244363] env[59659]: DEBUG nova.network.neutron [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 632.277368] env[59659]: DEBUG nova.network.neutron [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] Instance cache missing network info. 
{{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 632.427675] env[59659]: DEBUG nova.network.neutron [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 632.439594] env[59659]: DEBUG oslo_concurrency.lockutils [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Releasing lock "refresh_cache-4cdfac2f-6963-4f71-9a42-709f2eeb4f9a" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 632.440065] env[59659]: DEBUG nova.compute.manager [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] Start destroying the instance on the hypervisor. {{(pid=59659) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 632.440295] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] Destroying instance {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 632.440835] env[59659]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-c29f0ead-2351-473c-87d2-010aa76ce8e3 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 632.453522] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-26473b51-a8ea-43fa-8895-677711bf19f9 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 632.479749] env[59659]: WARNING nova.virt.vmwareapi.vmops [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a could not be found. [ 632.479978] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] Instance destroyed {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 632.480168] env[59659]: INFO nova.compute.manager [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] Took 0.04 seconds to destroy the instance on the hypervisor. [ 632.480404] env[59659]: DEBUG oslo.service.loopingcall [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=59659) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 632.480605] env[59659]: DEBUG nova.compute.manager [-] [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 632.481322] env[59659]: DEBUG nova.network.neutron [-] [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 632.527122] env[59659]: DEBUG nova.network.neutron [-] [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 632.539856] env[59659]: DEBUG nova.network.neutron [-] [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 632.555515] env[59659]: INFO nova.compute.manager [-] [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] Took 0.07 seconds to deallocate network for instance. [ 632.559413] env[59659]: DEBUG nova.compute.claims [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] Aborting claim: {{(pid=59659) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 632.559654] env[59659]: DEBUG oslo_concurrency.lockutils [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 632.559910] env[59659]: DEBUG oslo_concurrency.lockutils [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 632.669229] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b8f532a5-5d64-4229-ba2a-ce5075654b68 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 632.680949] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c5a1e0d-9c15-499a-b6ad-423ea352f816 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 632.717850] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c3583ff-691f-45d4-ad0c-55ed1789d04d {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 632.725959] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-075e4962-b794-4385-a2d0-e53c9160dd40 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 632.746470] env[59659]: DEBUG nova.compute.provider_tree [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d 
tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 632.763023] env[59659]: DEBUG nova.scheduler.client.report [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 177, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 632.785021] env[59659]: DEBUG oslo_concurrency.lockutils [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.222s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 632.785021] env[59659]: ERROR nova.compute.manager [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 0e4c20cd-dcc1-4b76-a581-ea92335f3e09, please check neutron logs for more information. 
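The inventory dict the report client logs above (total/reserved/allocation_ratio per resource class) maps straight onto Placement-style capacity. As a back-of-the-envelope helper only, assuming the usual usable = (total - reserved) * allocation_ratio formula rather than quoting Placement's code:

    # Rough illustration with the values from the inventory data logged above.
    inventory = {
        'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0},
    }

    def usable(inv):
        # assumed formula: usable = (total - reserved) * allocation_ratio
        return int((inv['total'] - inv['reserved']) * inv['allocation_ratio'])

    for rc, inv in inventory.items():
        print(rc, usable(inv))
    # VCPU 192, MEMORY_MB 196078, DISK_GB 400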
[ 632.785021] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] Traceback (most recent call last): [ 632.785021] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 632.785021] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] self.driver.spawn(context, instance, image_meta, [ 632.785021] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 632.785021] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 632.785021] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 632.785021] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] vm_ref = self.build_virtual_machine(instance, [ 632.785421] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 632.785421] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] vif_infos = vmwarevif.get_vif_info(self._session, [ 632.785421] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 632.785421] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] for vif in network_info: [ 632.785421] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 632.785421] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] return self._sync_wrapper(fn, *args, **kwargs) [ 632.785421] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 632.785421] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] self.wait() [ 632.785421] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 632.785421] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] self[:] = self._gt.wait() [ 632.785421] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 632.785421] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] return self._exit_event.wait() [ 632.785421] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 632.785421] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] result = hub.switch() [ 632.785747] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 632.785747] env[59659]: ERROR 
nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] return self.greenlet.switch() [ 632.785747] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 632.785747] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] result = function(*args, **kwargs) [ 632.785747] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 632.785747] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] return func(*args, **kwargs) [ 632.785747] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 632.785747] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] raise e [ 632.785747] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 632.785747] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] nwinfo = self.network_api.allocate_for_instance( [ 632.785747] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 632.785747] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] created_port_ids = self._update_ports_for_instance( [ 632.785747] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 632.786065] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] with excutils.save_and_reraise_exception(): [ 632.786065] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 632.786065] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] self.force_reraise() [ 632.786065] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 632.786065] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] raise self.value [ 632.786065] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 632.786065] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] updated_port = self._update_port( [ 632.786065] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 632.786065] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] _ensure_no_port_binding_failure(port) [ 632.786065] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 632.786065] env[59659]: ERROR 
nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] raise exception.PortBindingFailed(port_id=port['id']) [ 632.786065] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] nova.exception.PortBindingFailed: Binding failed for port 0e4c20cd-dcc1-4b76-a581-ea92335f3e09, please check neutron logs for more information. [ 632.786065] env[59659]: ERROR nova.compute.manager [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] [ 632.786374] env[59659]: DEBUG nova.compute.utils [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] Binding failed for port 0e4c20cd-dcc1-4b76-a581-ea92335f3e09, please check neutron logs for more information. {{(pid=59659) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 632.786700] env[59659]: DEBUG nova.compute.manager [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] Build of instance 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a was re-scheduled: Binding failed for port 0e4c20cd-dcc1-4b76-a581-ea92335f3e09, please check neutron logs for more information. {{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 632.787170] env[59659]: DEBUG nova.compute.manager [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] Unplugging VIFs for instance {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 632.787384] env[59659]: DEBUG oslo_concurrency.lockutils [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Acquiring lock "refresh_cache-4cdfac2f-6963-4f71-9a42-709f2eeb4f9a" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 632.787519] env[59659]: DEBUG oslo_concurrency.lockutils [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Acquired lock "refresh_cache-4cdfac2f-6963-4f71-9a42-709f2eeb4f9a" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 632.787665] env[59659]: DEBUG nova.network.neutron [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 632.832524] env[59659]: DEBUG nova.network.neutron [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] Instance cache missing network info. 
{{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 633.065662] env[59659]: DEBUG nova.network.neutron [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 633.071133] env[59659]: ERROR nova.compute.manager [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 8ec22c7e-09a3-4c3f-bb3d-32092582ca44, please check neutron logs for more information. [ 633.071133] env[59659]: ERROR nova.compute.manager Traceback (most recent call last): [ 633.071133] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 633.071133] env[59659]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 633.071133] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 633.071133] env[59659]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 633.071133] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 633.071133] env[59659]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 633.071133] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 633.071133] env[59659]: ERROR nova.compute.manager self.force_reraise() [ 633.071133] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 633.071133] env[59659]: ERROR nova.compute.manager raise self.value [ 633.071133] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 633.071133] env[59659]: ERROR nova.compute.manager updated_port = self._update_port( [ 633.071133] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 633.071133] env[59659]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 633.071617] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 633.071617] env[59659]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 633.071617] env[59659]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 8ec22c7e-09a3-4c3f-bb3d-32092582ca44, please check neutron logs for more information. 
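Both tracebacks above bottom out in nova/network/neutron.py's _ensure_no_port_binding_failure. Judging from the behaviour visible here, the guard amounts to checking the binding result Neutron returned for the updated port and raising PortBindingFailed when the binding was recorded as failed; a paraphrased sketch (the constant, class, and signature below are approximations inferred from these log entries, not copied from the source):

# Rough sketch of the guard the tracebacks end in; names are paraphrased.
VIF_TYPE_BINDING_FAILED = 'binding_failed'

class PortBindingFailed(Exception):
    def __init__(self, port_id):
        super().__init__(
            f"Binding failed for port {port_id}, "
            "please check neutron logs for more information.")

def ensure_no_port_binding_failure(port):
    # 'port' is the dict Neutron returned for the update; a host that could not
    # bind the port reports the failure sentinel instead of e.g. 'ovs'.
    if port.get('binding:vif_type') == VIF_TYPE_BINDING_FAILED:
        raise PortBindingFailed(port_id=port['id'])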
[ 633.071617] env[59659]: ERROR nova.compute.manager [ 633.071617] env[59659]: Traceback (most recent call last): [ 633.071617] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 633.071617] env[59659]: listener.cb(fileno) [ 633.071617] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 633.071617] env[59659]: result = function(*args, **kwargs) [ 633.071617] env[59659]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 633.071617] env[59659]: return func(*args, **kwargs) [ 633.071617] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 633.071617] env[59659]: raise e [ 633.071617] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 633.071617] env[59659]: nwinfo = self.network_api.allocate_for_instance( [ 633.071617] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 633.071617] env[59659]: created_port_ids = self._update_ports_for_instance( [ 633.071617] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 633.071617] env[59659]: with excutils.save_and_reraise_exception(): [ 633.071617] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 633.071617] env[59659]: self.force_reraise() [ 633.071617] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 633.071617] env[59659]: raise self.value [ 633.071617] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 633.071617] env[59659]: updated_port = self._update_port( [ 633.071617] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 633.071617] env[59659]: _ensure_no_port_binding_failure(port) [ 633.071617] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 633.071617] env[59659]: raise exception.PortBindingFailed(port_id=port['id']) [ 633.072325] env[59659]: nova.exception.PortBindingFailed: Binding failed for port 8ec22c7e-09a3-4c3f-bb3d-32092582ca44, please check neutron logs for more information. [ 633.072325] env[59659]: Removing descriptor: 12 [ 633.072325] env[59659]: ERROR nova.compute.manager [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 8ec22c7e-09a3-4c3f-bb3d-32092582ca44, please check neutron logs for more information. 
[ 633.072325] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] Traceback (most recent call last): [ 633.072325] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 633.072325] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] yield resources [ 633.072325] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 633.072325] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] self.driver.spawn(context, instance, image_meta, [ 633.072325] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 633.072325] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] self._vmops.spawn(context, instance, image_meta, injected_files, [ 633.072325] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 633.072325] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] vm_ref = self.build_virtual_machine(instance, [ 633.072624] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 633.072624] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] vif_infos = vmwarevif.get_vif_info(self._session, [ 633.072624] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 633.072624] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] for vif in network_info: [ 633.072624] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 633.072624] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] return self._sync_wrapper(fn, *args, **kwargs) [ 633.072624] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 633.072624] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] self.wait() [ 633.072624] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 633.072624] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] self[:] = self._gt.wait() [ 633.072624] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 633.072624] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] return self._exit_event.wait() [ 633.072624] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 633.072624] env[59659]: ERROR nova.compute.manager 
[instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] result = hub.switch() [ 633.072960] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 633.072960] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] return self.greenlet.switch() [ 633.072960] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 633.072960] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] result = function(*args, **kwargs) [ 633.072960] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 633.072960] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] return func(*args, **kwargs) [ 633.072960] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 633.072960] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] raise e [ 633.072960] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 633.072960] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] nwinfo = self.network_api.allocate_for_instance( [ 633.072960] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 633.072960] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] created_port_ids = self._update_ports_for_instance( [ 633.072960] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 633.073271] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] with excutils.save_and_reraise_exception(): [ 633.073271] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 633.073271] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] self.force_reraise() [ 633.073271] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 633.073271] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] raise self.value [ 633.073271] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 633.073271] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] updated_port = self._update_port( [ 633.073271] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 633.073271] env[59659]: ERROR nova.compute.manager [instance: 
b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] _ensure_no_port_binding_failure(port) [ 633.073271] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 633.073271] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] raise exception.PortBindingFailed(port_id=port['id']) [ 633.073271] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] nova.exception.PortBindingFailed: Binding failed for port 8ec22c7e-09a3-4c3f-bb3d-32092582ca44, please check neutron logs for more information. [ 633.073271] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] [ 633.073571] env[59659]: INFO nova.compute.manager [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] Terminating instance [ 633.076375] env[59659]: DEBUG oslo_concurrency.lockutils [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] Acquiring lock "refresh_cache-b0f907a1-c4f4-4d02-9f07-8a640af4cdc4" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 633.076654] env[59659]: DEBUG oslo_concurrency.lockutils [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] Acquired lock "refresh_cache-b0f907a1-c4f4-4d02-9f07-8a640af4cdc4" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 633.076940] env[59659]: DEBUG nova.network.neutron [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 633.079175] env[59659]: DEBUG oslo_concurrency.lockutils [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Releasing lock "refresh_cache-4cdfac2f-6963-4f71-9a42-709f2eeb4f9a" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 633.079470] env[59659]: DEBUG nova.compute.manager [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 633.079717] env[59659]: DEBUG nova.compute.manager [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 633.080137] env[59659]: DEBUG nova.network.neutron [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 633.273543] env[59659]: DEBUG nova.network.neutron [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 633.279775] env[59659]: DEBUG nova.network.neutron [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 633.286913] env[59659]: DEBUG nova.network.neutron [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 633.297124] env[59659]: INFO nova.compute.manager [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] Took 0.22 seconds to deallocate network for instance. 
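The error text asks the reader to check the Neutron side for port 0e4c20cd-dcc1-4b76-a581-ea92335f3e09. One quick way to see what Neutron recorded for the failed binding is to read the port's binding attributes, for example with openstacksdk (a diagnostic sketch, not part of the code path above; the cloud name is a placeholder):

# Diagnostic sketch only. Assumes openstacksdk is installed and a cloud entry
# named "devstack" exists in clouds.yaml (the name is a placeholder).
import openstack

conn = openstack.connect(cloud="devstack")
port = conn.network.get_port("0e4c20cd-dcc1-4b76-a581-ea92335f3e09")
# A port whose binding failed typically shows binding_vif_type == "binding_failed";
# binding_host_id tells you which host Neutron tried to bind it on.
print(port.binding_vif_type, port.binding_host_id, port.binding_profile)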
[ 633.412998] env[59659]: INFO nova.scheduler.client.report [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Deleted allocations for instance 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a [ 633.422201] env[59659]: DEBUG nova.network.neutron [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 633.430995] env[59659]: DEBUG nova.network.neutron [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] Successfully created port: b0fa307d-128a-44a3-990d-452c98019207 {{(pid=59659) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 633.439425] env[59659]: DEBUG oslo_concurrency.lockutils [None req-b5ae6b81-1aa9-4a97-9b4a-ac3fc144c56d tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Lock "4cdfac2f-6963-4f71-9a42-709f2eeb4f9a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 8.894s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 633.439825] env[59659]: DEBUG oslo_concurrency.lockutils [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Lock "4cdfac2f-6963-4f71-9a42-709f2eeb4f9a" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 3.158s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 633.440043] env[59659]: INFO nova.compute.manager [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] [instance: 4cdfac2f-6963-4f71-9a42-709f2eeb4f9a] During sync_power_state the instance has a pending task (spawning). Skip. [ 633.441161] env[59659]: DEBUG oslo_concurrency.lockutils [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Lock "4cdfac2f-6963-4f71-9a42-709f2eeb4f9a" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 633.464125] env[59659]: DEBUG oslo_concurrency.lockutils [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] Releasing lock "refresh_cache-b0f907a1-c4f4-4d02-9f07-8a640af4cdc4" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 633.464539] env[59659]: DEBUG nova.compute.manager [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] Start destroying the instance on the hypervisor. 
{{(pid=59659) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 633.464716] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] Destroying instance {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 633.465262] env[59659]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-4801bb76-31ac-4f9e-a318-bb8a1ec3c962 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 633.482914] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4978d1b5-8308-4b9e-bf32-e638929d7729 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 633.511394] env[59659]: WARNING nova.virt.vmwareapi.vmops [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance b0f907a1-c4f4-4d02-9f07-8a640af4cdc4 could not be found. [ 633.511750] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] Instance destroyed {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 633.511825] env[59659]: INFO nova.compute.manager [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] Took 0.05 seconds to destroy the instance on the hypervisor. [ 633.512375] env[59659]: DEBUG oslo.service.loopingcall [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59659) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 633.512375] env[59659]: DEBUG nova.compute.manager [-] [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 633.512375] env[59659]: DEBUG nova.network.neutron [-] [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 633.567719] env[59659]: DEBUG nova.network.neutron [-] [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 633.580084] env[59659]: DEBUG nova.network.neutron [-] [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 633.591412] env[59659]: INFO nova.compute.manager [-] [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] Took 0.08 seconds to deallocate network for instance. 
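The destroy sequence above is tolerant of a VM that never materialised on the backend: the driver looks the instance up by UUID (the SearchIndex.FindAllByUuid call), and when nothing is found it logs InstanceNotFound as a warning and carries on, so network deallocation and claim cleanup still run. In outline (the helper names here are illustrative stand-ins, not Nova's actual functions):

# Outline of the tolerant destroy path seen above; helpers are stand-ins.
def destroy_instance(find_vm_by_uuid, delete_vm, deallocate_network, instance_uuid):
    vm_ref = find_vm_by_uuid(instance_uuid)   # e.g. the SearchIndex.FindAllByUuid call
    if vm_ref is None:
        # "Instance does not exist on backend" -- nothing to tear down on the hypervisor.
        print(f"Instance {instance_uuid} not found on backend; treating as destroyed")
    else:
        delete_vm(vm_ref)
    deallocate_network(instance_uuid)          # runs either way, as in the log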
[ 633.596220] env[59659]: DEBUG nova.compute.claims [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] Aborting claim: {{(pid=59659) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 633.596502] env[59659]: DEBUG oslo_concurrency.lockutils [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 633.597155] env[59659]: DEBUG oslo_concurrency.lockutils [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 633.725487] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-acb41aab-5071-4747-9299-8c3d2b5ae368 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 633.737807] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b95254c-4f3b-4638-9f75-05fbbaf3b7d7 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 633.775186] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd24ef0a-b95e-4976-85ce-0b4e13235d28 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 633.780683] env[59659]: DEBUG oslo_concurrency.lockutils [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] Acquiring lock "dff5937a-0c12-46d4-878a-8c0e783c6695" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 633.780932] env[59659]: DEBUG oslo_concurrency.lockutils [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] Lock "dff5937a-0c12-46d4-878a-8c0e783c6695" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 633.787315] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7657c982-cc43-43e1-ac94-dbeb12103912 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 633.792154] env[59659]: DEBUG nova.compute.manager [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] Starting instance... 
{{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 633.809481] env[59659]: DEBUG nova.compute.provider_tree [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 633.822048] env[59659]: DEBUG nova.scheduler.client.report [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 177, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 633.846239] env[59659]: DEBUG oslo_concurrency.lockutils [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.249s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 633.846872] env[59659]: ERROR nova.compute.manager [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 8ec22c7e-09a3-4c3f-bb3d-32092582ca44, please check neutron logs for more information. 
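The traceback that follows repeats the same PortBindingFailed path; what matters operationally is what the manager does with it, which the surrounding entries spell out: abort the resource claim (releasing "compute_resources"), clean up the allocated networks, and hand the instance back for another scheduling attempt ("Build of instance ... was re-scheduled"). A compressed sketch of that control flow, using illustrative stand-ins rather than Nova's real names:

# Compressed sketch of the failure handling visible around this entry;
# the exception and helper names are illustrative stand-ins.
class BuildFailed(Exception):
    pass

def do_build_and_run_instance(build, abort_claim, cleanup_networks, reschedule):
    try:
        build()                  # raises here: PortBindingFailed during network allocation
    except BuildFailed:
        abort_claim()            # "Aborting claim" / lock "compute_resources" released
        cleanup_networks()       # "Unplugging VIFs" / "Deallocating network for instance"
        reschedule()             # instance goes back to the scheduler for another attempt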
[ 633.846872] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] Traceback (most recent call last): [ 633.846872] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 633.846872] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] self.driver.spawn(context, instance, image_meta, [ 633.846872] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 633.846872] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] self._vmops.spawn(context, instance, image_meta, injected_files, [ 633.846872] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 633.846872] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] vm_ref = self.build_virtual_machine(instance, [ 633.846872] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 633.846872] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] vif_infos = vmwarevif.get_vif_info(self._session, [ 633.846872] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 633.847244] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] for vif in network_info: [ 633.847244] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 633.847244] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] return self._sync_wrapper(fn, *args, **kwargs) [ 633.847244] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 633.847244] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] self.wait() [ 633.847244] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 633.847244] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] self[:] = self._gt.wait() [ 633.847244] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 633.847244] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] return self._exit_event.wait() [ 633.847244] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 633.847244] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] result = hub.switch() [ 633.847244] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 633.847244] env[59659]: ERROR 
nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] return self.greenlet.switch() [ 633.847244] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 633.847572] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] result = function(*args, **kwargs) [ 633.847572] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 633.847572] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] return func(*args, **kwargs) [ 633.847572] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 633.847572] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] raise e [ 633.847572] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 633.847572] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] nwinfo = self.network_api.allocate_for_instance( [ 633.847572] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 633.847572] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] created_port_ids = self._update_ports_for_instance( [ 633.847572] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 633.847572] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] with excutils.save_and_reraise_exception(): [ 633.847572] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 633.847572] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] self.force_reraise() [ 633.847882] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 633.847882] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] raise self.value [ 633.847882] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 633.847882] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] updated_port = self._update_port( [ 633.847882] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 633.847882] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] _ensure_no_port_binding_failure(port) [ 633.847882] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 633.847882] env[59659]: ERROR 
nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] raise exception.PortBindingFailed(port_id=port['id']) [ 633.847882] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] nova.exception.PortBindingFailed: Binding failed for port 8ec22c7e-09a3-4c3f-bb3d-32092582ca44, please check neutron logs for more information. [ 633.847882] env[59659]: ERROR nova.compute.manager [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] [ 633.847882] env[59659]: DEBUG nova.compute.utils [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] Binding failed for port 8ec22c7e-09a3-4c3f-bb3d-32092582ca44, please check neutron logs for more information. {{(pid=59659) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 633.861061] env[59659]: DEBUG nova.compute.manager [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] Build of instance b0f907a1-c4f4-4d02-9f07-8a640af4cdc4 was re-scheduled: Binding failed for port 8ec22c7e-09a3-4c3f-bb3d-32092582ca44, please check neutron logs for more information. {{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 633.861570] env[59659]: DEBUG nova.compute.manager [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] Unplugging VIFs for instance {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 633.861831] env[59659]: DEBUG oslo_concurrency.lockutils [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] Acquiring lock "refresh_cache-b0f907a1-c4f4-4d02-9f07-8a640af4cdc4" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 633.861940] env[59659]: DEBUG oslo_concurrency.lockutils [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] Acquired lock "refresh_cache-b0f907a1-c4f4-4d02-9f07-8a640af4cdc4" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 633.862121] env[59659]: DEBUG nova.network.neutron [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 633.870153] env[59659]: DEBUG oslo_concurrency.lockutils [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 633.870285] env[59659]: DEBUG oslo_concurrency.lockutils [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] Lock 
"compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 633.871892] env[59659]: INFO nova.compute.claims [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 633.936704] env[59659]: DEBUG nova.network.neutron [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 633.984561] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-54ffe745-13a1-4346-88bf-982b025df599 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 633.994749] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e9456d7-609a-400e-a1a9-2f8173bf660e {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 634.026389] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-87b707c0-98f8-40cf-b1e2-65742fcfbe21 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 634.033868] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-136f363b-173a-46c2-b8f6-f1eaaecbe1a2 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 634.047488] env[59659]: DEBUG oslo_service.periodic_task [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59659) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 634.047855] env[59659]: DEBUG nova.compute.provider_tree [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 634.049595] env[59659]: DEBUG oslo_service.periodic_task [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59659) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 634.050198] env[59659]: DEBUG nova.compute.manager [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Starting heal instance info cache {{(pid=59659) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 634.050198] env[59659]: DEBUG nova.compute.manager [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Rebuilding the list of instances to heal {{(pid=59659) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 634.060910] env[59659]: DEBUG nova.scheduler.client.report [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef 
tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 177, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 634.063582] env[59659]: DEBUG nova.compute.manager [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] Skipping network cache update for instance because it is Building. {{(pid=59659) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 634.063727] env[59659]: DEBUG nova.compute.manager [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] Skipping network cache update for instance because it is Building. {{(pid=59659) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 634.063860] env[59659]: DEBUG nova.compute.manager [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Didn't find any instances for network info cache update. {{(pid=59659) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 634.064294] env[59659]: DEBUG oslo_service.periodic_task [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59659) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 634.064577] env[59659]: DEBUG oslo_service.periodic_task [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59659) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 634.064709] env[59659]: DEBUG oslo_service.periodic_task [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59659) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 634.064894] env[59659]: DEBUG oslo_service.periodic_task [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59659) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 634.065228] env[59659]: DEBUG oslo_service.periodic_task [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59659) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 634.065409] env[59659]: DEBUG oslo_service.periodic_task [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59659) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 634.065680] env[59659]: DEBUG nova.compute.manager [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=59659) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 634.065743] env[59659]: DEBUG oslo_service.periodic_task [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59659) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 634.072532] env[59659]: DEBUG oslo_concurrency.lockutils [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.202s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 634.072975] env[59659]: DEBUG nova.compute.manager [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] Start building networks asynchronously for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 634.081081] env[59659]: DEBUG oslo_concurrency.lockutils [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 634.081081] env[59659]: DEBUG oslo_concurrency.lockutils [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 634.081270] env[59659]: DEBUG oslo_concurrency.lockutils [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 634.081634] env[59659]: DEBUG nova.compute.resource_tracker [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59659) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 634.082761] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-720173cf-8835-4d70-828c-9c572d28f59c {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 634.092281] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-34763825-0768-4a3b-987f-4883cffa5c1e {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 634.108573] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-60c42253-92aa-4b53-a41b-4a795862765f {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 634.115494] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d8352529-f4e7-4ae4-8470-d6d4066aa7db {{(pid=59659) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 634.120468] env[59659]: DEBUG nova.compute.utils [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] Using /dev/sd instead of None {{(pid=59659) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 634.124379] env[59659]: DEBUG nova.compute.manager [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] Allocating IP information in the background. {{(pid=59659) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 634.124379] env[59659]: DEBUG nova.network.neutron [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] allocate_for_instance() {{(pid=59659) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 634.151655] env[59659]: DEBUG nova.compute.resource_tracker [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181439MB free_disk=177GB free_vcpus=48 pci_devices=None {{(pid=59659) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 634.151844] env[59659]: DEBUG oslo_concurrency.lockutils [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 634.152056] env[59659]: DEBUG oslo_concurrency.lockutils [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 634.154983] env[59659]: DEBUG nova.compute.manager [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] Start building block device mappings for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 634.245460] env[59659]: DEBUG nova.compute.resource_tracker [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Instance b0f907a1-c4f4-4d02-9f07-8a640af4cdc4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59659) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 634.245645] env[59659]: DEBUG nova.compute.resource_tracker [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Instance 4a21b251-816d-4668-9a2e-eeabd9ed347b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59659) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 634.245748] env[59659]: DEBUG nova.compute.resource_tracker [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Instance dff5937a-0c12-46d4-878a-8c0e783c6695 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59659) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 634.245910] env[59659]: DEBUG nova.compute.resource_tracker [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Total usable vcpus: 48, total allocated vcpus: 2 {{(pid=59659) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 634.246063] env[59659]: DEBUG nova.compute.resource_tracker [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=768MB phys_disk=200GB used_disk=2GB total_vcpus=48 used_vcpus=2 pci_stats=[] {{(pid=59659) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 634.250597] env[59659]: DEBUG nova.compute.manager [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] Start spawning the instance on the hypervisor. {{(pid=59659) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 634.278376] env[59659]: DEBUG nova.virt.hardware [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-30T19:41:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-30T19:41:37Z,direct_url=,disk_format='vmdk',id=0fa786c9-f55e-46dc-b725-aa456ca9ff53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='3f4ccdf020b5413c9a4233eccb6b55a8',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-30T19:41:37Z,virtual_size=,visibility=), allow threads: False {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 634.278834] env[59659]: DEBUG nova.virt.hardware [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] Flavor limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 634.278834] env[59659]: DEBUG nova.virt.hardware [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] Image limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 634.278924] env[59659]: DEBUG nova.virt.hardware [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] Flavor pref 0:0:0 {{(pid=59659) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 634.279044] env[59659]: DEBUG nova.virt.hardware [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] Image pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 634.279397] env[59659]: DEBUG nova.virt.hardware [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 634.279397] env[59659]: DEBUG nova.virt.hardware [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 634.279530] env[59659]: DEBUG nova.virt.hardware [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 634.279693] env[59659]: DEBUG nova.virt.hardware [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] Got 1 possible topologies {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 634.279846] env[59659]: DEBUG nova.virt.hardware [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 634.280022] env[59659]: DEBUG nova.virt.hardware [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 634.280913] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c42f711-df3d-4ff6-bffc-2adb5372fb35 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 634.293208] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f08d456-884d-4967-a11d-46f15ad4dc9b {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 634.333448] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c82a810e-ea43-409a-9c4b-22607993180a {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 634.340863] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1c5b1ad-6330-471d-8326-78c554e7d209 
{{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 634.371699] env[59659]: DEBUG nova.network.neutron [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 634.373374] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c86bae7d-1424-4a6c-8619-b81ebbde159d {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 634.381535] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c4c9ed24-ffe8-475b-bdf2-15509d0cb76d {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 634.386701] env[59659]: DEBUG oslo_concurrency.lockutils [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] Releasing lock "refresh_cache-b0f907a1-c4f4-4d02-9f07-8a640af4cdc4" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 634.386701] env[59659]: DEBUG nova.compute.manager [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 634.386701] env[59659]: DEBUG nova.compute.manager [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 634.388638] env[59659]: DEBUG nova.network.neutron [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 634.398404] env[59659]: DEBUG nova.compute.provider_tree [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 634.408387] env[59659]: DEBUG nova.scheduler.client.report [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 177, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 634.425777] env[59659]: DEBUG nova.compute.resource_tracker [None 
req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59659) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 634.426032] env[59659]: DEBUG oslo_concurrency.lockutils [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.274s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 634.430653] env[59659]: DEBUG nova.network.neutron [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 634.441356] env[59659]: DEBUG nova.network.neutron [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 634.453637] env[59659]: INFO nova.compute.manager [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] Took 0.07 seconds to deallocate network for instance. [ 634.476730] env[59659]: DEBUG nova.policy [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e4c24dd0950644258b36bb04734df984', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8b11b617cfb243bca8848f333b96f265', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59659) authorize /opt/stack/nova/nova/policy.py:203}} [ 634.591335] env[59659]: INFO nova.scheduler.client.report [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] Deleted allocations for instance b0f907a1-c4f4-4d02-9f07-8a640af4cdc4 [ 634.631770] env[59659]: DEBUG oslo_concurrency.lockutils [None req-5f5a6212-82be-4bec-b863-18b40972328e tempest-ServerDiagnosticsTest-1663640938 tempest-ServerDiagnosticsTest-1663640938-project-member] Lock "b0f907a1-c4f4-4d02-9f07-8a640af4cdc4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.371s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 634.632116] env[59659]: DEBUG oslo_concurrency.lockutils [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Lock "b0f907a1-c4f4-4d02-9f07-8a640af4cdc4" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 4.350s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 634.632230] env[59659]:
INFO nova.compute.manager [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] [instance: b0f907a1-c4f4-4d02-9f07-8a640af4cdc4] During sync_power_state the instance has a pending task (spawning). Skip. [ 634.632387] env[59659]: DEBUG oslo_concurrency.lockutils [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Lock "b0f907a1-c4f4-4d02-9f07-8a640af4cdc4" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 634.828054] env[59659]: DEBUG oslo_concurrency.lockutils [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] Acquiring lock "39f071f7-2895-4cf8-aa41-0e683397a2de" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 634.828054] env[59659]: DEBUG oslo_concurrency.lockutils [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] Lock "39f071f7-2895-4cf8-aa41-0e683397a2de" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 634.841167] env[59659]: DEBUG nova.compute.manager [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] Starting instance... 
{{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 634.975101] env[59659]: DEBUG oslo_concurrency.lockutils [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 634.975349] env[59659]: DEBUG oslo_concurrency.lockutils [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 634.977185] env[59659]: INFO nova.compute.claims [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 635.093903] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5619ba8d-84be-4acf-bc91-48d442f77797 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 635.102191] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab57fa17-c1ab-4a44-b829-f984feb672ff {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 635.133590] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-380bd71c-5d7e-4c9d-a1dc-62583794b2f2 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 635.141186] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1a0e0cc-fc09-4481-852a-7bfe4368cb78 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 635.154358] env[59659]: DEBUG nova.compute.provider_tree [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 635.166250] env[59659]: DEBUG nova.scheduler.client.report [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 177, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 635.181621] env[59659]: DEBUG oslo_concurrency.lockutils [None req-22a13f65-88df-47bd-a90b-d541d8a36016 
tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.206s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 635.182493] env[59659]: DEBUG nova.compute.manager [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] Start building networks asynchronously for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 635.229699] env[59659]: DEBUG nova.compute.utils [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] Using /dev/sd instead of None {{(pid=59659) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 635.231535] env[59659]: DEBUG nova.compute.manager [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] Allocating IP information in the background. {{(pid=59659) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 635.233055] env[59659]: DEBUG nova.network.neutron [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] allocate_for_instance() {{(pid=59659) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 635.245027] env[59659]: DEBUG nova.compute.manager [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] Start building block device mappings for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 635.310370] env[59659]: DEBUG nova.compute.manager [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] Start spawning the instance on the hypervisor. 
{{(pid=59659) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 635.333733] env[59659]: DEBUG nova.virt.hardware [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-30T19:41:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-30T19:41:37Z,direct_url=,disk_format='vmdk',id=0fa786c9-f55e-46dc-b725-aa456ca9ff53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='3f4ccdf020b5413c9a4233eccb6b55a8',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-30T19:41:37Z,virtual_size=,visibility=), allow threads: False {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 635.333973] env[59659]: DEBUG nova.virt.hardware [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] Flavor limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 635.334138] env[59659]: DEBUG nova.virt.hardware [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] Image limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 635.334318] env[59659]: DEBUG nova.virt.hardware [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] Flavor pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 635.334458] env[59659]: DEBUG nova.virt.hardware [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] Image pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 635.334703] env[59659]: DEBUG nova.virt.hardware [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 635.334941] env[59659]: DEBUG nova.virt.hardware [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 635.335111] env[59659]: DEBUG nova.virt.hardware [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 635.335273] env[59659]: DEBUG 
nova.virt.hardware [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] Got 1 possible topologies {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 635.335427] env[59659]: DEBUG nova.virt.hardware [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 635.335589] env[59659]: DEBUG nova.virt.hardware [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 635.336640] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78558ceb-a11b-4469-a740-eb895a379af3 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 635.344862] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e2a98395-bc8d-4ca2-825e-d3da6992f79d {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 635.446051] env[59659]: DEBUG nova.network.neutron [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] Successfully created port: cefa5c6e-e484-4e7d-8eb3-6ba853f51118 {{(pid=59659) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 635.559976] env[59659]: DEBUG nova.policy [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '746d830057bd42e49ba0fc3af8a22db6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'dad9bedbe759426d9c98810f2789dbfd', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59659) authorize /opt/stack/nova/nova/policy.py:203}} [ 636.316449] env[59659]: DEBUG nova.network.neutron [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] Successfully created port: 64711077-b1f4-4b70-b2fe-7552161f7d16 {{(pid=59659) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 638.396415] env[59659]: DEBUG oslo_concurrency.lockutils [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] Acquiring lock "10fc8044-6912-412f-9b84-50efb0e9a398" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 638.396727] env[59659]: DEBUG 
oslo_concurrency.lockutils [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] Lock "10fc8044-6912-412f-9b84-50efb0e9a398" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 638.406817] env[59659]: ERROR nova.compute.manager [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port cefa5c6e-e484-4e7d-8eb3-6ba853f51118, please check neutron logs for more information. [ 638.406817] env[59659]: ERROR nova.compute.manager Traceback (most recent call last): [ 638.406817] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 638.406817] env[59659]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 638.406817] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 638.406817] env[59659]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 638.406817] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 638.406817] env[59659]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 638.406817] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 638.406817] env[59659]: ERROR nova.compute.manager self.force_reraise() [ 638.406817] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 638.406817] env[59659]: ERROR nova.compute.manager raise self.value [ 638.406817] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 638.406817] env[59659]: ERROR nova.compute.manager updated_port = self._update_port( [ 638.406817] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 638.406817] env[59659]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 638.407292] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 638.407292] env[59659]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 638.407292] env[59659]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port cefa5c6e-e484-4e7d-8eb3-6ba853f51118, please check neutron logs for more information. 
[ 638.407292] env[59659]: ERROR nova.compute.manager [ 638.407292] env[59659]: Traceback (most recent call last): [ 638.407292] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 638.407292] env[59659]: listener.cb(fileno) [ 638.407292] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 638.407292] env[59659]: result = function(*args, **kwargs) [ 638.407292] env[59659]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 638.407292] env[59659]: return func(*args, **kwargs) [ 638.407292] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 638.407292] env[59659]: raise e [ 638.407292] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 638.407292] env[59659]: nwinfo = self.network_api.allocate_for_instance( [ 638.407292] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 638.407292] env[59659]: created_port_ids = self._update_ports_for_instance( [ 638.408254] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 638.408254] env[59659]: with excutils.save_and_reraise_exception(): [ 638.408254] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 638.408254] env[59659]: self.force_reraise() [ 638.408254] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 638.408254] env[59659]: raise self.value [ 638.408254] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 638.408254] env[59659]: updated_port = self._update_port( [ 638.408254] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 638.408254] env[59659]: _ensure_no_port_binding_failure(port) [ 638.408254] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 638.408254] env[59659]: raise exception.PortBindingFailed(port_id=port['id']) [ 638.408254] env[59659]: nova.exception.PortBindingFailed: Binding failed for port cefa5c6e-e484-4e7d-8eb3-6ba853f51118, please check neutron logs for more information. [ 638.408254] env[59659]: Removing descriptor: 12 [ 638.409200] env[59659]: ERROR nova.compute.manager [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port cefa5c6e-e484-4e7d-8eb3-6ba853f51118, please check neutron logs for more information. 
[ 638.409200] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] Traceback (most recent call last): [ 638.409200] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 638.409200] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] yield resources [ 638.409200] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 638.409200] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] self.driver.spawn(context, instance, image_meta, [ 638.409200] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 638.409200] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] self._vmops.spawn(context, instance, image_meta, injected_files, [ 638.409200] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 638.409200] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] vm_ref = self.build_virtual_machine(instance, [ 638.409200] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 638.409601] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] vif_infos = vmwarevif.get_vif_info(self._session, [ 638.409601] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 638.409601] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] for vif in network_info: [ 638.409601] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 638.409601] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] return self._sync_wrapper(fn, *args, **kwargs) [ 638.409601] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 638.409601] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] self.wait() [ 638.409601] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 638.409601] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] self[:] = self._gt.wait() [ 638.409601] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 638.409601] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] return self._exit_event.wait() [ 638.409601] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 638.409601] env[59659]: ERROR nova.compute.manager 
[instance: dff5937a-0c12-46d4-878a-8c0e783c6695] result = hub.switch() [ 638.409601] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 638.410055] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] return self.greenlet.switch() [ 638.410055] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 638.410055] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] result = function(*args, **kwargs) [ 638.410055] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 638.410055] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] return func(*args, **kwargs) [ 638.410055] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 638.410055] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] raise e [ 638.410055] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 638.410055] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] nwinfo = self.network_api.allocate_for_instance( [ 638.410055] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 638.410055] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] created_port_ids = self._update_ports_for_instance( [ 638.410055] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 638.410055] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] with excutils.save_and_reraise_exception(): [ 638.410374] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 638.410374] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] self.force_reraise() [ 638.410374] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 638.410374] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] raise self.value [ 638.410374] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 638.410374] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] updated_port = self._update_port( [ 638.410374] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 638.410374] env[59659]: ERROR nova.compute.manager [instance: 
dff5937a-0c12-46d4-878a-8c0e783c6695] _ensure_no_port_binding_failure(port) [ 638.410374] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 638.410374] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] raise exception.PortBindingFailed(port_id=port['id']) [ 638.410374] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] nova.exception.PortBindingFailed: Binding failed for port cefa5c6e-e484-4e7d-8eb3-6ba853f51118, please check neutron logs for more information. [ 638.410374] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] [ 638.410693] env[59659]: INFO nova.compute.manager [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] Terminating instance [ 638.414353] env[59659]: DEBUG oslo_concurrency.lockutils [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] Acquiring lock "refresh_cache-dff5937a-0c12-46d4-878a-8c0e783c6695" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 638.414555] env[59659]: DEBUG oslo_concurrency.lockutils [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] Acquired lock "refresh_cache-dff5937a-0c12-46d4-878a-8c0e783c6695" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 638.414821] env[59659]: DEBUG nova.network.neutron [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 638.419841] env[59659]: DEBUG nova.compute.manager [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] Starting instance... {{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 638.470847] env[59659]: DEBUG nova.network.neutron [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] Instance cache missing network info. 
{{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 638.482434] env[59659]: DEBUG oslo_concurrency.lockutils [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 638.482608] env[59659]: DEBUG oslo_concurrency.lockutils [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 638.484394] env[59659]: INFO nova.compute.claims [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 638.541375] env[59659]: ERROR nova.compute.manager [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port b0fa307d-128a-44a3-990d-452c98019207, please check neutron logs for more information. [ 638.541375] env[59659]: ERROR nova.compute.manager Traceback (most recent call last): [ 638.541375] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 638.541375] env[59659]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 638.541375] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 638.541375] env[59659]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 638.541375] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 638.541375] env[59659]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 638.541375] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 638.541375] env[59659]: ERROR nova.compute.manager self.force_reraise() [ 638.541375] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 638.541375] env[59659]: ERROR nova.compute.manager raise self.value [ 638.541375] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 638.541375] env[59659]: ERROR nova.compute.manager updated_port = self._update_port( [ 638.541375] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 638.541375] env[59659]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 638.542321] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 638.542321] env[59659]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 
638.542321] env[59659]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port b0fa307d-128a-44a3-990d-452c98019207, please check neutron logs for more information. [ 638.542321] env[59659]: ERROR nova.compute.manager [ 638.542321] env[59659]: Traceback (most recent call last): [ 638.542321] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 638.542321] env[59659]: listener.cb(fileno) [ 638.542321] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 638.542321] env[59659]: result = function(*args, **kwargs) [ 638.542321] env[59659]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 638.542321] env[59659]: return func(*args, **kwargs) [ 638.542321] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 638.542321] env[59659]: raise e [ 638.542321] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 638.542321] env[59659]: nwinfo = self.network_api.allocate_for_instance( [ 638.542321] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 638.542321] env[59659]: created_port_ids = self._update_ports_for_instance( [ 638.542321] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 638.542321] env[59659]: with excutils.save_and_reraise_exception(): [ 638.542321] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 638.542321] env[59659]: self.force_reraise() [ 638.542321] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 638.542321] env[59659]: raise self.value [ 638.542321] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 638.542321] env[59659]: updated_port = self._update_port( [ 638.542321] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 638.542321] env[59659]: _ensure_no_port_binding_failure(port) [ 638.542321] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 638.542321] env[59659]: raise exception.PortBindingFailed(port_id=port['id']) [ 638.543055] env[59659]: nova.exception.PortBindingFailed: Binding failed for port b0fa307d-128a-44a3-990d-452c98019207, please check neutron logs for more information. [ 638.543055] env[59659]: Removing descriptor: 15 [ 638.543055] env[59659]: ERROR nova.compute.manager [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port b0fa307d-128a-44a3-990d-452c98019207, please check neutron logs for more information. 
[ 638.543055] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] Traceback (most recent call last): [ 638.543055] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 638.543055] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] yield resources [ 638.543055] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 638.543055] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] self.driver.spawn(context, instance, image_meta, [ 638.543055] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 638.543055] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 638.543055] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 638.543055] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] vm_ref = self.build_virtual_machine(instance, [ 638.543404] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 638.543404] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] vif_infos = vmwarevif.get_vif_info(self._session, [ 638.543404] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 638.543404] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] for vif in network_info: [ 638.543404] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 638.543404] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] return self._sync_wrapper(fn, *args, **kwargs) [ 638.543404] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 638.543404] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] self.wait() [ 638.543404] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 638.543404] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] self[:] = self._gt.wait() [ 638.543404] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 638.543404] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] return self._exit_event.wait() [ 638.543404] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 638.543404] env[59659]: ERROR nova.compute.manager 
[instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] result = hub.switch() [ 638.543906] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 638.543906] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] return self.greenlet.switch() [ 638.543906] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 638.543906] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] result = function(*args, **kwargs) [ 638.543906] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 638.543906] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] return func(*args, **kwargs) [ 638.543906] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 638.543906] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] raise e [ 638.543906] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 638.543906] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] nwinfo = self.network_api.allocate_for_instance( [ 638.543906] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 638.543906] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] created_port_ids = self._update_ports_for_instance( [ 638.543906] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 638.544340] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] with excutils.save_and_reraise_exception(): [ 638.544340] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 638.544340] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] self.force_reraise() [ 638.544340] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 638.544340] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] raise self.value [ 638.544340] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 638.544340] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] updated_port = self._update_port( [ 638.544340] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 638.544340] env[59659]: ERROR nova.compute.manager [instance: 
4a21b251-816d-4668-9a2e-eeabd9ed347b] _ensure_no_port_binding_failure(port) [ 638.544340] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 638.544340] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] raise exception.PortBindingFailed(port_id=port['id']) [ 638.544340] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] nova.exception.PortBindingFailed: Binding failed for port b0fa307d-128a-44a3-990d-452c98019207, please check neutron logs for more information. [ 638.544340] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] [ 638.544674] env[59659]: INFO nova.compute.manager [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] Terminating instance [ 638.544674] env[59659]: DEBUG oslo_concurrency.lockutils [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] Acquiring lock "refresh_cache-4a21b251-816d-4668-9a2e-eeabd9ed347b" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 638.545152] env[59659]: DEBUG oslo_concurrency.lockutils [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] Acquired lock "refresh_cache-4a21b251-816d-4668-9a2e-eeabd9ed347b" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 638.545152] env[59659]: DEBUG nova.network.neutron [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 638.636188] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e3c020c-8407-4c36-a0fd-615b4587d5a2 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 638.646203] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00e335f0-d15d-442f-8c8f-005e083c87cf {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 638.649837] env[59659]: DEBUG nova.network.neutron [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] Instance cache missing network info. 
{{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 638.679446] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4e283de-8a13-4ffc-9786-d98f2fd426f0 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 638.687679] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a067fac-b8fd-4e90-be8e-48551e0bc961 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 638.706407] env[59659]: DEBUG nova.compute.provider_tree [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 638.718116] env[59659]: DEBUG nova.scheduler.client.report [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 177, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 638.740878] env[59659]: DEBUG nova.network.neutron [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 638.741847] env[59659]: DEBUG oslo_concurrency.lockutils [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.259s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 638.742572] env[59659]: DEBUG nova.compute.manager [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] Start building networks asynchronously for instance. 
{{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 638.749240] env[59659]: DEBUG oslo_concurrency.lockutils [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] Releasing lock "refresh_cache-dff5937a-0c12-46d4-878a-8c0e783c6695" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 638.749423] env[59659]: DEBUG nova.compute.manager [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] Start destroying the instance on the hypervisor. {{(pid=59659) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 638.749565] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] Destroying instance {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 638.750044] env[59659]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-c6f2db72-8991-4918-8739-6863ccf07461 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 638.761020] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ae911e9-7591-43fc-8807-139546f4e0b5 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 638.786543] env[59659]: WARNING nova.virt.vmwareapi.vmops [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance dff5937a-0c12-46d4-878a-8c0e783c6695 could not be found. [ 638.786996] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] Instance destroyed {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 638.787425] env[59659]: INFO nova.compute.manager [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] Took 0.04 seconds to destroy the instance on the hypervisor. [ 638.788880] env[59659]: DEBUG oslo.service.loopingcall [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=59659) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 638.788880] env[59659]: DEBUG nova.compute.utils [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] Using /dev/sd instead of None {{(pid=59659) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 638.791257] env[59659]: DEBUG nova.compute.manager [-] [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 638.791356] env[59659]: DEBUG nova.network.neutron [-] [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 638.793235] env[59659]: DEBUG nova.compute.manager [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] Allocating IP information in the background. {{(pid=59659) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 638.793408] env[59659]: DEBUG nova.network.neutron [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] allocate_for_instance() {{(pid=59659) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 638.802907] env[59659]: DEBUG nova.compute.manager [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] Start building block device mappings for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 638.834773] env[59659]: DEBUG nova.network.neutron [-] [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 638.854022] env[59659]: DEBUG nova.network.neutron [-] [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 638.859953] env[59659]: INFO nova.compute.manager [-] [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] Took 0.07 seconds to deallocate network for instance. 
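The PortBindingFailed tracebacks above all bottom out in _ensure_no_port_binding_failure() at nova/network/neutron.py:294, which turns a failed Neutron port binding into the exception that aborts the spawn. A minimal sketch of that check (not the nova source), assuming Neutron reports the failure through the port's binding:vif_type attribute with a 'binding_failed' sentinel value:

    # Sketch only: mirrors the check the tracebacks point at, under the
    # assumption that a failed binding is signalled via binding:vif_type.
    VIF_TYPE_BINDING_FAILED = 'binding_failed'  # assumed sentinel value

    class PortBindingFailed(Exception):
        def __init__(self, port_id):
            super().__init__(
                f"Binding failed for port {port_id}, "
                "please check neutron logs for more information.")

    def _ensure_no_port_binding_failure(port):
        # 'port' is the dict Neutron returns for the created/updated port
        if port.get('binding:vif_type') == VIF_TYPE_BINDING_FAILED:
            raise PortBindingFailed(port_id=port['id'])

Once raised, _allocate_network_async re-raises it (manager.py line 1982 in the tracebacks), the spawn fails, and the Terminating instance / Deallocating network sequence that follows each traceback cleans up before the build is re-scheduled.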
[ 638.863682] env[59659]: DEBUG nova.compute.claims [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] Aborting claim: {{(pid=59659) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 638.863961] env[59659]: DEBUG oslo_concurrency.lockutils [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 638.864616] env[59659]: DEBUG oslo_concurrency.lockutils [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 638.890981] env[59659]: DEBUG nova.compute.manager [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] Start spawning the instance on the hypervisor. {{(pid=59659) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 638.928185] env[59659]: DEBUG nova.virt.hardware [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-30T19:41:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-30T19:41:37Z,direct_url=,disk_format='vmdk',id=0fa786c9-f55e-46dc-b725-aa456ca9ff53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='3f4ccdf020b5413c9a4233eccb6b55a8',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-30T19:41:37Z,virtual_size=,visibility=), allow threads: False {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 638.928516] env[59659]: DEBUG nova.virt.hardware [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] Flavor limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 638.928799] env[59659]: DEBUG nova.virt.hardware [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] Image limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 638.929042] env[59659]: DEBUG nova.virt.hardware [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] Flavor pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 638.929271] 
env[59659]: DEBUG nova.virt.hardware [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] Image pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 638.929494] env[59659]: DEBUG nova.virt.hardware [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 638.929800] env[59659]: DEBUG nova.virt.hardware [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 638.930068] env[59659]: DEBUG nova.virt.hardware [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 638.930337] env[59659]: DEBUG nova.virt.hardware [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] Got 1 possible topologies {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 638.930583] env[59659]: DEBUG nova.virt.hardware [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 638.930852] env[59659]: DEBUG nova.virt.hardware [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 638.932428] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1bb15c9d-c4f1-4165-907f-b02fd21b3255 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 638.949746] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-721eaa08-ab20-4806-9891-c5231afd6061 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 639.011934] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-65353b05-2ed2-4dcc-8326-748fc25011d0 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 639.021831] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f6dba5c-2cee-4e88-8456-a8a0c1183574 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 639.077090] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-e049bb09-7d9d-4f2b-a470-6b148426a89f {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 639.085260] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-91a431f8-4e54-4b8b-903e-4c5fba57e71b {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 639.098456] env[59659]: DEBUG nova.compute.provider_tree [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 639.109285] env[59659]: DEBUG nova.scheduler.client.report [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 177, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 639.130726] env[59659]: DEBUG oslo_concurrency.lockutils [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.266s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 639.131246] env[59659]: ERROR nova.compute.manager [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port cefa5c6e-e484-4e7d-8eb3-6ba853f51118, please check neutron logs for more information. 
[ 639.131246] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] Traceback (most recent call last): [ 639.131246] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 639.131246] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] self.driver.spawn(context, instance, image_meta, [ 639.131246] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 639.131246] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] self._vmops.spawn(context, instance, image_meta, injected_files, [ 639.131246] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 639.131246] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] vm_ref = self.build_virtual_machine(instance, [ 639.131246] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 639.131246] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] vif_infos = vmwarevif.get_vif_info(self._session, [ 639.131246] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 639.131734] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] for vif in network_info: [ 639.131734] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 639.131734] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] return self._sync_wrapper(fn, *args, **kwargs) [ 639.131734] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 639.131734] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] self.wait() [ 639.131734] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 639.131734] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] self[:] = self._gt.wait() [ 639.131734] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 639.131734] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] return self._exit_event.wait() [ 639.131734] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 639.131734] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] result = hub.switch() [ 639.131734] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 639.131734] env[59659]: ERROR 
nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] return self.greenlet.switch() [ 639.131734] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 639.132673] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] result = function(*args, **kwargs) [ 639.132673] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 639.132673] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] return func(*args, **kwargs) [ 639.132673] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 639.132673] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] raise e [ 639.132673] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 639.132673] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] nwinfo = self.network_api.allocate_for_instance( [ 639.132673] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 639.132673] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] created_port_ids = self._update_ports_for_instance( [ 639.132673] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 639.132673] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] with excutils.save_and_reraise_exception(): [ 639.132673] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 639.132673] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] self.force_reraise() [ 639.133523] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 639.133523] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] raise self.value [ 639.133523] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 639.133523] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] updated_port = self._update_port( [ 639.133523] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 639.133523] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] _ensure_no_port_binding_failure(port) [ 639.133523] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 639.133523] env[59659]: ERROR 
nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] raise exception.PortBindingFailed(port_id=port['id']) [ 639.133523] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] nova.exception.PortBindingFailed: Binding failed for port cefa5c6e-e484-4e7d-8eb3-6ba853f51118, please check neutron logs for more information. [ 639.133523] env[59659]: ERROR nova.compute.manager [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] [ 639.134133] env[59659]: DEBUG nova.compute.utils [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] Binding failed for port cefa5c6e-e484-4e7d-8eb3-6ba853f51118, please check neutron logs for more information. {{(pid=59659) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 639.134133] env[59659]: DEBUG nova.compute.manager [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] Build of instance dff5937a-0c12-46d4-878a-8c0e783c6695 was re-scheduled: Binding failed for port cefa5c6e-e484-4e7d-8eb3-6ba853f51118, please check neutron logs for more information. {{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 639.134133] env[59659]: DEBUG nova.compute.manager [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] Unplugging VIFs for instance {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 639.134312] env[59659]: DEBUG oslo_concurrency.lockutils [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] Acquiring lock "refresh_cache-dff5937a-0c12-46d4-878a-8c0e783c6695" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 639.134448] env[59659]: DEBUG oslo_concurrency.lockutils [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] Acquired lock "refresh_cache-dff5937a-0c12-46d4-878a-8c0e783c6695" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 639.134602] env[59659]: DEBUG nova.network.neutron [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 639.187202] env[59659]: DEBUG nova.network.neutron [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] Instance cache missing network info. 
{{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 639.322186] env[59659]: DEBUG nova.network.neutron [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 639.334112] env[59659]: DEBUG oslo_concurrency.lockutils [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] Releasing lock "refresh_cache-4a21b251-816d-4668-9a2e-eeabd9ed347b" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 639.334112] env[59659]: DEBUG nova.compute.manager [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] Start destroying the instance on the hypervisor. {{(pid=59659) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 639.334112] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] Destroying instance {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 639.334112] env[59659]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-db431441-d408-431e-b0d6-2e19f203dbd5 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 639.344315] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c1184f66-4f34-41ac-a03c-247bac6800e9 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 639.368595] env[59659]: WARNING nova.virt.vmwareapi.vmops [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 4a21b251-816d-4668-9a2e-eeabd9ed347b could not be found. [ 639.368595] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] Instance destroyed {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 639.368595] env[59659]: INFO nova.compute.manager [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] Took 0.04 seconds to destroy the instance on the hypervisor. [ 639.368595] env[59659]: DEBUG oslo.service.loopingcall [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=59659) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 639.368595] env[59659]: DEBUG nova.compute.manager [-] [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 639.368792] env[59659]: DEBUG nova.network.neutron [-] [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 639.402129] env[59659]: DEBUG nova.policy [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f1957df6576d4aa19e41c96a046d136e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8c9fe156687d4221b76f0e662bd590a7', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59659) authorize /opt/stack/nova/nova/policy.py:203}} [ 639.421989] env[59659]: ERROR nova.compute.manager [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 64711077-b1f4-4b70-b2fe-7552161f7d16, please check neutron logs for more information. [ 639.421989] env[59659]: ERROR nova.compute.manager Traceback (most recent call last): [ 639.421989] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 639.421989] env[59659]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 639.421989] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 639.421989] env[59659]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 639.421989] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 639.421989] env[59659]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 639.421989] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 639.421989] env[59659]: ERROR nova.compute.manager self.force_reraise() [ 639.421989] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 639.421989] env[59659]: ERROR nova.compute.manager raise self.value [ 639.421989] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 639.421989] env[59659]: ERROR nova.compute.manager updated_port = self._update_port( [ 639.421989] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 639.421989] env[59659]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 639.422523] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 639.422523] env[59659]: ERROR 
nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 639.422523] env[59659]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 64711077-b1f4-4b70-b2fe-7552161f7d16, please check neutron logs for more information. [ 639.422523] env[59659]: ERROR nova.compute.manager [ 639.422523] env[59659]: Traceback (most recent call last): [ 639.422523] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 639.422523] env[59659]: listener.cb(fileno) [ 639.422523] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 639.422523] env[59659]: result = function(*args, **kwargs) [ 639.422523] env[59659]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 639.422523] env[59659]: return func(*args, **kwargs) [ 639.422523] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 639.422523] env[59659]: raise e [ 639.422523] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 639.422523] env[59659]: nwinfo = self.network_api.allocate_for_instance( [ 639.422523] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 639.422523] env[59659]: created_port_ids = self._update_ports_for_instance( [ 639.422523] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 639.422523] env[59659]: with excutils.save_and_reraise_exception(): [ 639.422523] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 639.422523] env[59659]: self.force_reraise() [ 639.422523] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 639.422523] env[59659]: raise self.value [ 639.422523] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 639.422523] env[59659]: updated_port = self._update_port( [ 639.422523] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 639.422523] env[59659]: _ensure_no_port_binding_failure(port) [ 639.422523] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 639.422523] env[59659]: raise exception.PortBindingFailed(port_id=port['id']) [ 639.423292] env[59659]: nova.exception.PortBindingFailed: Binding failed for port 64711077-b1f4-4b70-b2fe-7552161f7d16, please check neutron logs for more information. [ 639.423292] env[59659]: Removing descriptor: 16 [ 639.423292] env[59659]: ERROR nova.compute.manager [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 64711077-b1f4-4b70-b2fe-7552161f7d16, please check neutron logs for more information. 
[ 639.423292] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] Traceback (most recent call last): [ 639.423292] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 639.423292] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] yield resources [ 639.423292] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 639.423292] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] self.driver.spawn(context, instance, image_meta, [ 639.423292] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 639.423292] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] self._vmops.spawn(context, instance, image_meta, injected_files, [ 639.423292] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 639.423292] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] vm_ref = self.build_virtual_machine(instance, [ 639.423640] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 639.423640] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] vif_infos = vmwarevif.get_vif_info(self._session, [ 639.423640] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 639.423640] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] for vif in network_info: [ 639.423640] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 639.423640] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] return self._sync_wrapper(fn, *args, **kwargs) [ 639.423640] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 639.423640] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] self.wait() [ 639.423640] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 639.423640] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] self[:] = self._gt.wait() [ 639.423640] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 639.423640] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] return self._exit_event.wait() [ 639.423640] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 639.423640] env[59659]: ERROR nova.compute.manager 
[instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] result = hub.switch() [ 639.424097] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 639.424097] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] return self.greenlet.switch() [ 639.424097] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 639.424097] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] result = function(*args, **kwargs) [ 639.424097] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 639.424097] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] return func(*args, **kwargs) [ 639.424097] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 639.424097] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] raise e [ 639.424097] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 639.424097] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] nwinfo = self.network_api.allocate_for_instance( [ 639.424097] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 639.424097] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] created_port_ids = self._update_ports_for_instance( [ 639.424097] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 639.424470] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] with excutils.save_and_reraise_exception(): [ 639.424470] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 639.424470] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] self.force_reraise() [ 639.424470] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 639.424470] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] raise self.value [ 639.424470] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 639.424470] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] updated_port = self._update_port( [ 639.424470] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 639.424470] env[59659]: ERROR nova.compute.manager [instance: 
39f071f7-2895-4cf8-aa41-0e683397a2de] _ensure_no_port_binding_failure(port) [ 639.424470] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 639.424470] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] raise exception.PortBindingFailed(port_id=port['id']) [ 639.424470] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] nova.exception.PortBindingFailed: Binding failed for port 64711077-b1f4-4b70-b2fe-7552161f7d16, please check neutron logs for more information. [ 639.424470] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] [ 639.425661] env[59659]: INFO nova.compute.manager [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] Terminating instance [ 639.426063] env[59659]: DEBUG nova.network.neutron [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 639.430189] env[59659]: DEBUG oslo_concurrency.lockutils [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] Acquiring lock "refresh_cache-39f071f7-2895-4cf8-aa41-0e683397a2de" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 639.430189] env[59659]: DEBUG oslo_concurrency.lockutils [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] Acquired lock "refresh_cache-39f071f7-2895-4cf8-aa41-0e683397a2de" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 639.430189] env[59659]: DEBUG nova.network.neutron [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 639.443018] env[59659]: DEBUG oslo_concurrency.lockutils [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] Releasing lock "refresh_cache-dff5937a-0c12-46d4-878a-8c0e783c6695" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 639.443018] env[59659]: DEBUG nova.compute.manager [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 639.443018] env[59659]: DEBUG nova.compute.manager [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 639.443018] env[59659]: DEBUG nova.network.neutron [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 639.475537] env[59659]: DEBUG nova.network.neutron [-] [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 639.488918] env[59659]: DEBUG nova.network.neutron [-] [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 639.489391] env[59659]: DEBUG nova.network.neutron [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 639.492771] env[59659]: DEBUG nova.network.neutron [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 639.502057] env[59659]: DEBUG nova.network.neutron [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 639.503164] env[59659]: INFO nova.compute.manager [-] [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] Took 0.14 seconds to deallocate network for instance. 
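The "Inventory has not changed" reports above list the resource provider's VCPU, MEMORY_MB and DISK_GB inventory. A small sketch of how those figures become schedulable capacity, assuming Placement's usual (total - reserved) * allocation_ratio formula:

    # Sketch only: derives usable capacity from the inventory data logged above.
    inventory = {
        'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0},
    }
    for rc, inv in inventory.items():
        capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print(f"{rc}: capacity {capacity:g}")
    # -> VCPU: capacity 192, MEMORY_MB: capacity 196078, DISK_GB: capacity 400

The min_unit, max_unit and step_size fields also present in the logged inventory further constrain any single allocation, e.g. max_unit=16 caps one instance at 16 VCPUs on this provider.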
[ 639.505091] env[59659]: DEBUG nova.compute.claims [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] Aborting claim: {{(pid=59659) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 639.505467] env[59659]: DEBUG oslo_concurrency.lockutils [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 639.508300] env[59659]: DEBUG oslo_concurrency.lockutils [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 639.512837] env[59659]: INFO nova.compute.manager [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] [instance: dff5937a-0c12-46d4-878a-8c0e783c6695] Took 0.07 seconds to deallocate network for instance. [ 639.634088] env[59659]: DEBUG nova.network.neutron [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 639.656381] env[59659]: DEBUG oslo_concurrency.lockutils [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] Releasing lock "refresh_cache-39f071f7-2895-4cf8-aa41-0e683397a2de" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 639.656381] env[59659]: DEBUG nova.compute.manager [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] Start destroying the instance on the hypervisor. 
{{(pid=59659) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 639.656381] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] Destroying instance {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 639.656381] env[59659]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-a4c0219b-56bc-419e-a372-185329f1d0a4 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 639.671365] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dddb084f-125f-4754-92cd-54e0566d6aeb {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 639.683402] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93a2717e-aadf-4960-876e-36af54e978cc {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 639.687170] env[59659]: INFO nova.scheduler.client.report [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] Deleted allocations for instance dff5937a-0c12-46d4-878a-8c0e783c6695 [ 639.715985] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9150b59-b543-4d06-9eeb-6cb67efdee77 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 639.720132] env[59659]: WARNING nova.virt.vmwareapi.vmops [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 39f071f7-2895-4cf8-aa41-0e683397a2de could not be found. [ 639.720132] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] Instance destroyed {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 639.720324] env[59659]: INFO nova.compute.manager [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] Took 0.07 seconds to destroy the instance on the hypervisor. [ 639.720696] env[59659]: DEBUG oslo.service.loopingcall [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=59659) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 639.721564] env[59659]: DEBUG nova.compute.manager [-] [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 639.724620] env[59659]: DEBUG nova.network.neutron [-] [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 639.727011] env[59659]: DEBUG oslo_concurrency.lockutils [None req-ea7ab62e-3009-4f78-a5f1-fb437042f5ef tempest-ServersV294TestFqdnHostnames-1499296313 tempest-ServersV294TestFqdnHostnames-1499296313-project-member] Lock "dff5937a-0c12-46d4-878a-8c0e783c6695" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 5.946s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 639.760456] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e1b5ee5-9981-4035-a0b7-0386c5624395 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 639.770377] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4bad5ca-a527-4845-9d0b-d9b5ed514848 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 639.774396] env[59659]: DEBUG nova.network.neutron [-] [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 639.787563] env[59659]: DEBUG nova.compute.provider_tree [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 639.792629] env[59659]: DEBUG nova.network.neutron [-] [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 639.803090] env[59659]: DEBUG nova.scheduler.client.report [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 177, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 639.808570] env[59659]: INFO nova.compute.manager [-] [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] Took 0.09 seconds to deallocate network for instance. 
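The "Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce" entries above dump the resource-provider inventory that the compute node reports to Placement. As a worked sketch of what those numbers mean (illustrative only, not Nova or Placement code): schedulable capacity per resource class is (total - reserved) * allocation_ratio, so the 48 physical vCPUs with a 4.0 allocation ratio expose 192 schedulable VCPU units.

# Inventory record copied from the log entry above.
inventory = {
    'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0,
                  'min_unit': 1,   'max_unit': 16,    'step_size': 1},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0,
                  'min_unit': 1,   'max_unit': 65530, 'step_size': 1},
    'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0,
                  'min_unit': 1,   'max_unit': 177,   'step_size': 1},
}

def capacity(record):
    """Schedulable capacity Placement derives from one inventory record."""
    return (record['total'] - record['reserved']) * record['allocation_ratio']

for resource_class, record in inventory.items():
    print(resource_class, capacity(record))
# VCPU 192.0
# MEMORY_MB 196078.0
# DISK_GB 400.0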
[ 639.810235] env[59659]: DEBUG nova.compute.claims [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] Aborting claim: {{(pid=59659) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 639.810235] env[59659]: DEBUG oslo_concurrency.lockutils [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 639.829269] env[59659]: DEBUG oslo_concurrency.lockutils [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.323s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 639.829879] env[59659]: ERROR nova.compute.manager [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port b0fa307d-128a-44a3-990d-452c98019207, please check neutron logs for more information. [ 639.829879] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] Traceback (most recent call last): [ 639.829879] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 639.829879] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] self.driver.spawn(context, instance, image_meta, [ 639.829879] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 639.829879] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 639.829879] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 639.829879] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] vm_ref = self.build_virtual_machine(instance, [ 639.829879] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 639.829879] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] vif_infos = vmwarevif.get_vif_info(self._session, [ 639.829879] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 639.830278] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] for vif in network_info: [ 639.830278] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 639.830278] 
env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] return self._sync_wrapper(fn, *args, **kwargs) [ 639.830278] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 639.830278] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] self.wait() [ 639.830278] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 639.830278] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] self[:] = self._gt.wait() [ 639.830278] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 639.830278] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] return self._exit_event.wait() [ 639.830278] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 639.830278] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] result = hub.switch() [ 639.830278] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 639.830278] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] return self.greenlet.switch() [ 639.830278] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 639.831052] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] result = function(*args, **kwargs) [ 639.831052] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 639.831052] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] return func(*args, **kwargs) [ 639.831052] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 639.831052] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] raise e [ 639.831052] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 639.831052] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] nwinfo = self.network_api.allocate_for_instance( [ 639.831052] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 639.831052] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] created_port_ids = self._update_ports_for_instance( [ 639.831052] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 639.831052] env[59659]: ERROR nova.compute.manager [instance: 
4a21b251-816d-4668-9a2e-eeabd9ed347b] with excutils.save_and_reraise_exception(): [ 639.831052] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 639.831052] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] self.force_reraise() [ 639.831510] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 639.831510] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] raise self.value [ 639.831510] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 639.831510] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] updated_port = self._update_port( [ 639.831510] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 639.831510] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] _ensure_no_port_binding_failure(port) [ 639.831510] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 639.831510] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] raise exception.PortBindingFailed(port_id=port['id']) [ 639.831510] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] nova.exception.PortBindingFailed: Binding failed for port b0fa307d-128a-44a3-990d-452c98019207, please check neutron logs for more information. [ 639.831510] env[59659]: ERROR nova.compute.manager [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] [ 639.831510] env[59659]: DEBUG nova.compute.utils [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] Binding failed for port b0fa307d-128a-44a3-990d-452c98019207, please check neutron logs for more information. {{(pid=59659) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 639.832133] env[59659]: DEBUG oslo_concurrency.lockutils [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.022s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 639.835048] env[59659]: DEBUG nova.compute.manager [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] Build of instance 4a21b251-816d-4668-9a2e-eeabd9ed347b was re-scheduled: Binding failed for port b0fa307d-128a-44a3-990d-452c98019207, please check neutron logs for more information. 
{{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 639.835430] env[59659]: DEBUG nova.compute.manager [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] Unplugging VIFs for instance {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 639.835647] env[59659]: DEBUG oslo_concurrency.lockutils [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] Acquiring lock "refresh_cache-4a21b251-816d-4668-9a2e-eeabd9ed347b" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 639.835847] env[59659]: DEBUG oslo_concurrency.lockutils [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] Acquired lock "refresh_cache-4a21b251-816d-4668-9a2e-eeabd9ed347b" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 639.835931] env[59659]: DEBUG nova.network.neutron [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 639.879757] env[59659]: DEBUG nova.network.neutron [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] Instance cache missing network info. 
{{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 639.960046] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fda1575f-625f-49c8-82d8-5998c0575ff5 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 639.972247] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-30aea3f9-0169-4127-8ac6-0ff0474833f1 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 640.011842] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8e401bf-f775-46c9-a5de-d498c2d663c6 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 640.022968] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38f0cfcb-b6ce-4fff-9db9-2977856087f0 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 640.039283] env[59659]: DEBUG nova.compute.provider_tree [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 640.050328] env[59659]: DEBUG nova.network.neutron [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 640.052545] env[59659]: DEBUG nova.scheduler.client.report [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 177, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 640.059965] env[59659]: DEBUG oslo_concurrency.lockutils [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] Releasing lock "refresh_cache-4a21b251-816d-4668-9a2e-eeabd9ed347b" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 640.060199] env[59659]: DEBUG nova.compute.manager [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 640.060349] env[59659]: DEBUG nova.compute.manager [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 640.060506] env[59659]: DEBUG nova.network.neutron [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 640.066078] env[59659]: DEBUG oslo_concurrency.lockutils [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.234s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 640.066706] env[59659]: ERROR nova.compute.manager [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 64711077-b1f4-4b70-b2fe-7552161f7d16, please check neutron logs for more information. [ 640.066706] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] Traceback (most recent call last): [ 640.066706] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 640.066706] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] self.driver.spawn(context, instance, image_meta, [ 640.066706] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 640.066706] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] self._vmops.spawn(context, instance, image_meta, injected_files, [ 640.066706] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 640.066706] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] vm_ref = self.build_virtual_machine(instance, [ 640.066706] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 640.066706] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] vif_infos = vmwarevif.get_vif_info(self._session, [ 640.066706] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 640.066991] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] for vif in network_info: [ 640.066991] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] File 
"/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 640.066991] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] return self._sync_wrapper(fn, *args, **kwargs) [ 640.066991] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 640.066991] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] self.wait() [ 640.066991] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 640.066991] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] self[:] = self._gt.wait() [ 640.066991] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 640.066991] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] return self._exit_event.wait() [ 640.066991] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 640.066991] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] result = hub.switch() [ 640.066991] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 640.066991] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] return self.greenlet.switch() [ 640.066991] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 640.067376] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] result = function(*args, **kwargs) [ 640.067376] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 640.067376] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] return func(*args, **kwargs) [ 640.067376] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 640.067376] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] raise e [ 640.067376] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 640.067376] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] nwinfo = self.network_api.allocate_for_instance( [ 640.067376] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 640.067376] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] created_port_ids = self._update_ports_for_instance( [ 640.067376] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 
640.067376] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] with excutils.save_and_reraise_exception(): [ 640.067376] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 640.067376] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] self.force_reraise() [ 640.067684] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 640.067684] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] raise self.value [ 640.067684] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 640.067684] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] updated_port = self._update_port( [ 640.067684] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 640.067684] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] _ensure_no_port_binding_failure(port) [ 640.067684] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 640.067684] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] raise exception.PortBindingFailed(port_id=port['id']) [ 640.067684] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] nova.exception.PortBindingFailed: Binding failed for port 64711077-b1f4-4b70-b2fe-7552161f7d16, please check neutron logs for more information. [ 640.067684] env[59659]: ERROR nova.compute.manager [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] [ 640.067684] env[59659]: DEBUG nova.compute.utils [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] Binding failed for port 64711077-b1f4-4b70-b2fe-7552161f7d16, please check neutron logs for more information. {{(pid=59659) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 640.071559] env[59659]: DEBUG nova.compute.manager [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] Build of instance 39f071f7-2895-4cf8-aa41-0e683397a2de was re-scheduled: Binding failed for port 64711077-b1f4-4b70-b2fe-7552161f7d16, please check neutron logs for more information. 
{{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 640.071995] env[59659]: DEBUG nova.compute.manager [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] Unplugging VIFs for instance {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 640.072220] env[59659]: DEBUG oslo_concurrency.lockutils [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] Acquiring lock "refresh_cache-39f071f7-2895-4cf8-aa41-0e683397a2de" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 640.075578] env[59659]: DEBUG oslo_concurrency.lockutils [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] Acquired lock "refresh_cache-39f071f7-2895-4cf8-aa41-0e683397a2de" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 640.075578] env[59659]: DEBUG nova.network.neutron [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 640.094209] env[59659]: DEBUG nova.network.neutron [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 640.104163] env[59659]: DEBUG nova.network.neutron [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 640.112313] env[59659]: INFO nova.compute.manager [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] [instance: 4a21b251-816d-4668-9a2e-eeabd9ed347b] Took 0.05 seconds to deallocate network for instance. [ 640.117268] env[59659]: DEBUG nova.network.neutron [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] Instance cache missing network info. 
{{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 640.222390] env[59659]: INFO nova.scheduler.client.report [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] Deleted allocations for instance 4a21b251-816d-4668-9a2e-eeabd9ed347b [ 640.243111] env[59659]: DEBUG oslo_concurrency.lockutils [None req-e65b6231-b29f-4a1a-bf29-cdf65abe061a tempest-TenantUsagesTestJSON-217578962 tempest-TenantUsagesTestJSON-217578962-project-member] Lock "4a21b251-816d-4668-9a2e-eeabd9ed347b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 8.912s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 640.434085] env[59659]: DEBUG nova.network.neutron [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 640.446181] env[59659]: DEBUG oslo_concurrency.lockutils [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] Releasing lock "refresh_cache-39f071f7-2895-4cf8-aa41-0e683397a2de" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 640.446431] env[59659]: DEBUG nova.compute.manager [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 640.446592] env[59659]: DEBUG nova.compute.manager [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 640.446762] env[59659]: DEBUG nova.network.neutron [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 640.473335] env[59659]: DEBUG nova.network.neutron [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] Instance cache missing network info. 
{{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 640.482720] env[59659]: DEBUG nova.network.neutron [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 640.496247] env[59659]: INFO nova.compute.manager [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] [instance: 39f071f7-2895-4cf8-aa41-0e683397a2de] Took 0.05 seconds to deallocate network for instance. [ 640.613454] env[59659]: INFO nova.scheduler.client.report [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] Deleted allocations for instance 39f071f7-2895-4cf8-aa41-0e683397a2de [ 640.644295] env[59659]: DEBUG oslo_concurrency.lockutils [None req-22a13f65-88df-47bd-a90b-d541d8a36016 tempest-ServerExternalEventsTest-103728035 tempest-ServerExternalEventsTest-103728035-project-member] Lock "39f071f7-2895-4cf8-aa41-0e683397a2de" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 5.816s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 641.437940] env[59659]: DEBUG nova.network.neutron [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] Successfully created port: af2b4893-ceaf-45d1-a1df-5e3041a748f9 {{(pid=59659) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 642.454225] env[59659]: DEBUG oslo_concurrency.lockutils [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Acquiring lock "dd588677-08d1-43d8-bff3-62b655a5a194" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 642.455277] env[59659]: DEBUG oslo_concurrency.lockutils [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Lock "dd588677-08d1-43d8-bff3-62b655a5a194" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 642.470781] env[59659]: DEBUG nova.compute.manager [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: dd588677-08d1-43d8-bff3-62b655a5a194] Starting instance... 
{{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 642.527353] env[59659]: DEBUG oslo_concurrency.lockutils [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 642.527718] env[59659]: DEBUG oslo_concurrency.lockutils [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 642.529173] env[59659]: INFO nova.compute.claims [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: dd588677-08d1-43d8-bff3-62b655a5a194] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 642.658109] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e8180968-7bc0-472e-8aad-daa6e8d53037 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 642.665498] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fdb3924b-35a0-40ab-bc5c-00393b579387 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 642.703190] env[59659]: DEBUG nova.network.neutron [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] Successfully created port: 9165c9cb-f774-41e9-8027-3bd201a2a306 {{(pid=59659) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 642.705006] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a873cd3e-0c7f-4812-9b5b-ede62c536665 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 642.712874] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48a0b909-e3d9-4c2d-9200-581ba1f48ec8 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 642.728266] env[59659]: DEBUG nova.compute.provider_tree [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 642.740139] env[59659]: DEBUG nova.scheduler.client.report [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 
'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 177, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 642.764098] env[59659]: DEBUG oslo_concurrency.lockutils [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.236s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 642.764606] env[59659]: DEBUG nova.compute.manager [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: dd588677-08d1-43d8-bff3-62b655a5a194] Start building networks asynchronously for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 642.805350] env[59659]: DEBUG nova.compute.utils [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Using /dev/sd instead of None {{(pid=59659) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 642.806726] env[59659]: DEBUG nova.compute.manager [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: dd588677-08d1-43d8-bff3-62b655a5a194] Allocating IP information in the background. {{(pid=59659) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 642.806953] env[59659]: DEBUG nova.network.neutron [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: dd588677-08d1-43d8-bff3-62b655a5a194] allocate_for_instance() {{(pid=59659) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 642.822276] env[59659]: DEBUG nova.compute.manager [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: dd588677-08d1-43d8-bff3-62b655a5a194] Start building block device mappings for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 642.899168] env[59659]: DEBUG nova.compute.manager [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: dd588677-08d1-43d8-bff3-62b655a5a194] Start spawning the instance on the hypervisor. 
{{(pid=59659) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 642.926200] env[59659]: DEBUG nova.virt.hardware [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-30T19:41:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-30T19:41:37Z,direct_url=,disk_format='vmdk',id=0fa786c9-f55e-46dc-b725-aa456ca9ff53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='3f4ccdf020b5413c9a4233eccb6b55a8',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-30T19:41:37Z,virtual_size=,visibility=), allow threads: False {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 642.926200] env[59659]: DEBUG nova.virt.hardware [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Flavor limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 642.926200] env[59659]: DEBUG nova.virt.hardware [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Image limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 642.926586] env[59659]: DEBUG nova.virt.hardware [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Flavor pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 642.926586] env[59659]: DEBUG nova.virt.hardware [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Image pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 642.926586] env[59659]: DEBUG nova.virt.hardware [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 642.926586] env[59659]: DEBUG nova.virt.hardware [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 642.926586] env[59659]: DEBUG nova.virt.hardware [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59659) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 642.926730] env[59659]: DEBUG nova.virt.hardware [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Got 1 possible topologies {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 642.926730] env[59659]: DEBUG nova.virt.hardware [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 642.926841] env[59659]: DEBUG nova.virt.hardware [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 642.928719] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c4098504-2d61-46fc-a414-feee8cc90ed2 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 642.943728] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c2d9291-963d-4e9e-9ce5-4ecf7e08ddee {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 643.078228] env[59659]: DEBUG nova.policy [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ddd7a5cce0914c8cbd4698144cfb5be5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e35d0d53e9be49c4812e6268e521dfaf', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59659) authorize /opt/stack/nova/nova/policy.py:203}} [ 644.107429] env[59659]: DEBUG nova.network.neutron [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] Successfully created port: 5619df1e-12bd-47f0-811c-2734fb1a77d2 {{(pid=59659) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 644.796272] env[59659]: DEBUG nova.network.neutron [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: dd588677-08d1-43d8-bff3-62b655a5a194] Successfully created port: d2091aef-34df-49c0-a615-b43ef3305034 {{(pid=59659) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 652.627231] env[59659]: DEBUG nova.compute.manager [req-efd81a84-0f5b-4c8c-8afb-6d27061ff56b req-016f0b7d-e721-4f16-859d-0d97432554ca service nova] [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] Received event network-changed-af2b4893-ceaf-45d1-a1df-5e3041a748f9 {{(pid=59659) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 652.627231] env[59659]: DEBUG 
nova.compute.manager [req-efd81a84-0f5b-4c8c-8afb-6d27061ff56b req-016f0b7d-e721-4f16-859d-0d97432554ca service nova] [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] Refreshing instance network info cache due to event network-changed-af2b4893-ceaf-45d1-a1df-5e3041a748f9. {{(pid=59659) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 652.627231] env[59659]: DEBUG oslo_concurrency.lockutils [req-efd81a84-0f5b-4c8c-8afb-6d27061ff56b req-016f0b7d-e721-4f16-859d-0d97432554ca service nova] Acquiring lock "refresh_cache-10fc8044-6912-412f-9b84-50efb0e9a398" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 652.627231] env[59659]: DEBUG oslo_concurrency.lockutils [req-efd81a84-0f5b-4c8c-8afb-6d27061ff56b req-016f0b7d-e721-4f16-859d-0d97432554ca service nova] Acquired lock "refresh_cache-10fc8044-6912-412f-9b84-50efb0e9a398" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 652.627231] env[59659]: DEBUG nova.network.neutron [req-efd81a84-0f5b-4c8c-8afb-6d27061ff56b req-016f0b7d-e721-4f16-859d-0d97432554ca service nova] [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] Refreshing network info cache for port af2b4893-ceaf-45d1-a1df-5e3041a748f9 {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 652.659555] env[59659]: ERROR nova.compute.manager [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port af2b4893-ceaf-45d1-a1df-5e3041a748f9, please check neutron logs for more information. [ 652.659555] env[59659]: ERROR nova.compute.manager Traceback (most recent call last): [ 652.659555] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 652.659555] env[59659]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 652.659555] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 652.659555] env[59659]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 652.659555] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 652.659555] env[59659]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 652.659555] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 652.659555] env[59659]: ERROR nova.compute.manager self.force_reraise() [ 652.659555] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 652.659555] env[59659]: ERROR nova.compute.manager raise self.value [ 652.659555] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 652.659555] env[59659]: ERROR nova.compute.manager updated_port = self._update_port( [ 652.659555] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 652.659555] env[59659]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 652.659981] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in 
_ensure_no_port_binding_failure [ 652.659981] env[59659]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 652.659981] env[59659]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port af2b4893-ceaf-45d1-a1df-5e3041a748f9, please check neutron logs for more information. [ 652.659981] env[59659]: ERROR nova.compute.manager [ 652.659981] env[59659]: Traceback (most recent call last): [ 652.659981] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 652.659981] env[59659]: listener.cb(fileno) [ 652.659981] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 652.659981] env[59659]: result = function(*args, **kwargs) [ 652.659981] env[59659]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 652.659981] env[59659]: return func(*args, **kwargs) [ 652.659981] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 652.659981] env[59659]: raise e [ 652.659981] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 652.659981] env[59659]: nwinfo = self.network_api.allocate_for_instance( [ 652.659981] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 652.659981] env[59659]: created_port_ids = self._update_ports_for_instance( [ 652.659981] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 652.659981] env[59659]: with excutils.save_and_reraise_exception(): [ 652.659981] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 652.659981] env[59659]: self.force_reraise() [ 652.659981] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 652.659981] env[59659]: raise self.value [ 652.659981] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 652.659981] env[59659]: updated_port = self._update_port( [ 652.659981] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 652.659981] env[59659]: _ensure_no_port_binding_failure(port) [ 652.659981] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 652.659981] env[59659]: raise exception.PortBindingFailed(port_id=port['id']) [ 652.661252] env[59659]: nova.exception.PortBindingFailed: Binding failed for port af2b4893-ceaf-45d1-a1df-5e3041a748f9, please check neutron logs for more information. [ 652.661252] env[59659]: Removing descriptor: 14 [ 652.661419] env[59659]: ERROR nova.compute.manager [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port af2b4893-ceaf-45d1-a1df-5e3041a748f9, please check neutron logs for more information. 
[ 652.661419] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] Traceback (most recent call last): [ 652.661419] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 652.661419] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] yield resources [ 652.661419] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 652.661419] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] self.driver.spawn(context, instance, image_meta, [ 652.661419] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 652.661419] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] self._vmops.spawn(context, instance, image_meta, injected_files, [ 652.661419] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 652.661419] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] vm_ref = self.build_virtual_machine(instance, [ 652.661419] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 652.661692] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] vif_infos = vmwarevif.get_vif_info(self._session, [ 652.661692] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 652.661692] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] for vif in network_info: [ 652.661692] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 652.661692] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] return self._sync_wrapper(fn, *args, **kwargs) [ 652.661692] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 652.661692] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] self.wait() [ 652.661692] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 652.661692] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] self[:] = self._gt.wait() [ 652.661692] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 652.661692] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] return self._exit_event.wait() [ 652.661692] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 652.661692] env[59659]: ERROR nova.compute.manager 
[instance: 10fc8044-6912-412f-9b84-50efb0e9a398] result = hub.switch() [ 652.661692] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 652.662170] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] return self.greenlet.switch() [ 652.662170] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 652.662170] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] result = function(*args, **kwargs) [ 652.662170] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 652.662170] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] return func(*args, **kwargs) [ 652.662170] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 652.662170] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] raise e [ 652.662170] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 652.662170] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] nwinfo = self.network_api.allocate_for_instance( [ 652.662170] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 652.662170] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] created_port_ids = self._update_ports_for_instance( [ 652.662170] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 652.662170] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] with excutils.save_and_reraise_exception(): [ 652.662609] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 652.662609] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] self.force_reraise() [ 652.662609] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 652.662609] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] raise self.value [ 652.662609] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 652.662609] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] updated_port = self._update_port( [ 652.662609] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 652.662609] env[59659]: ERROR nova.compute.manager [instance: 
10fc8044-6912-412f-9b84-50efb0e9a398] _ensure_no_port_binding_failure(port) [ 652.662609] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 652.662609] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] raise exception.PortBindingFailed(port_id=port['id']) [ 652.662609] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] nova.exception.PortBindingFailed: Binding failed for port af2b4893-ceaf-45d1-a1df-5e3041a748f9, please check neutron logs for more information. [ 652.662609] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] [ 652.662912] env[59659]: INFO nova.compute.manager [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] Terminating instance [ 652.668537] env[59659]: DEBUG oslo_concurrency.lockutils [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] Acquiring lock "refresh_cache-10fc8044-6912-412f-9b84-50efb0e9a398" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 652.825362] env[59659]: DEBUG nova.network.neutron [req-efd81a84-0f5b-4c8c-8afb-6d27061ff56b req-016f0b7d-e721-4f16-859d-0d97432554ca service nova] [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 653.505334] env[59659]: DEBUG nova.network.neutron [req-efd81a84-0f5b-4c8c-8afb-6d27061ff56b req-016f0b7d-e721-4f16-859d-0d97432554ca service nova] [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 653.518491] env[59659]: DEBUG oslo_concurrency.lockutils [req-efd81a84-0f5b-4c8c-8afb-6d27061ff56b req-016f0b7d-e721-4f16-859d-0d97432554ca service nova] Releasing lock "refresh_cache-10fc8044-6912-412f-9b84-50efb0e9a398" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 653.519020] env[59659]: DEBUG oslo_concurrency.lockutils [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] Acquired lock "refresh_cache-10fc8044-6912-412f-9b84-50efb0e9a398" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 653.519311] env[59659]: DEBUG nova.network.neutron [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 653.653969] env[59659]: ERROR nova.compute.manager [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port d2091aef-34df-49c0-a615-b43ef3305034, please check neutron logs for more information. 
[ 653.653969] env[59659]: ERROR nova.compute.manager Traceback (most recent call last): [ 653.653969] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 653.653969] env[59659]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 653.653969] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 653.653969] env[59659]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 653.653969] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 653.653969] env[59659]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 653.653969] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 653.653969] env[59659]: ERROR nova.compute.manager self.force_reraise() [ 653.653969] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 653.653969] env[59659]: ERROR nova.compute.manager raise self.value [ 653.653969] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 653.653969] env[59659]: ERROR nova.compute.manager updated_port = self._update_port( [ 653.653969] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 653.653969] env[59659]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 653.654650] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 653.654650] env[59659]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 653.654650] env[59659]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port d2091aef-34df-49c0-a615-b43ef3305034, please check neutron logs for more information. 
[ 653.654650] env[59659]: ERROR nova.compute.manager [ 653.655458] env[59659]: Traceback (most recent call last): [ 653.655581] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 653.655581] env[59659]: listener.cb(fileno) [ 653.655725] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 653.655725] env[59659]: result = function(*args, **kwargs) [ 653.655807] env[59659]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 653.655807] env[59659]: return func(*args, **kwargs) [ 653.655873] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 653.655873] env[59659]: raise e [ 653.655938] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 653.655938] env[59659]: nwinfo = self.network_api.allocate_for_instance( [ 653.656008] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 653.656008] env[59659]: created_port_ids = self._update_ports_for_instance( [ 653.656084] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 653.656084] env[59659]: with excutils.save_and_reraise_exception(): [ 653.656147] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 653.656147] env[59659]: self.force_reraise() [ 653.656213] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 653.656213] env[59659]: raise self.value [ 653.656277] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 653.656277] env[59659]: updated_port = self._update_port( [ 653.656526] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 653.656526] env[59659]: _ensure_no_port_binding_failure(port) [ 653.656526] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 653.656526] env[59659]: raise exception.PortBindingFailed(port_id=port['id']) [ 653.656526] env[59659]: nova.exception.PortBindingFailed: Binding failed for port d2091aef-34df-49c0-a615-b43ef3305034, please check neutron logs for more information. [ 653.656526] env[59659]: Removing descriptor: 12 [ 653.659294] env[59659]: ERROR nova.compute.manager [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: dd588677-08d1-43d8-bff3-62b655a5a194] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port d2091aef-34df-49c0-a615-b43ef3305034, please check neutron logs for more information. 
[ 653.659294] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] Traceback (most recent call last): [ 653.659294] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 653.659294] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] yield resources [ 653.659294] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 653.659294] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] self.driver.spawn(context, instance, image_meta, [ 653.659294] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 653.659294] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] self._vmops.spawn(context, instance, image_meta, injected_files, [ 653.659294] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 653.659294] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] vm_ref = self.build_virtual_machine(instance, [ 653.659294] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 653.659818] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] vif_infos = vmwarevif.get_vif_info(self._session, [ 653.659818] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 653.659818] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] for vif in network_info: [ 653.659818] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 653.659818] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] return self._sync_wrapper(fn, *args, **kwargs) [ 653.659818] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 653.659818] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] self.wait() [ 653.659818] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 653.659818] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] self[:] = self._gt.wait() [ 653.659818] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 653.659818] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] return self._exit_event.wait() [ 653.659818] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 653.659818] env[59659]: ERROR nova.compute.manager 
[instance: dd588677-08d1-43d8-bff3-62b655a5a194] result = hub.switch() [ 653.659818] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 653.660218] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] return self.greenlet.switch() [ 653.660218] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 653.660218] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] result = function(*args, **kwargs) [ 653.660218] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 653.660218] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] return func(*args, **kwargs) [ 653.660218] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 653.660218] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] raise e [ 653.660218] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 653.660218] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] nwinfo = self.network_api.allocate_for_instance( [ 653.660218] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 653.660218] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] created_port_ids = self._update_ports_for_instance( [ 653.660218] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 653.660218] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] with excutils.save_and_reraise_exception(): [ 653.664442] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 653.664442] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] self.force_reraise() [ 653.664442] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 653.664442] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] raise self.value [ 653.664442] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 653.664442] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] updated_port = self._update_port( [ 653.664442] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 653.664442] env[59659]: ERROR nova.compute.manager [instance: 
dd588677-08d1-43d8-bff3-62b655a5a194] _ensure_no_port_binding_failure(port) [ 653.664442] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 653.664442] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] raise exception.PortBindingFailed(port_id=port['id']) [ 653.664442] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] nova.exception.PortBindingFailed: Binding failed for port d2091aef-34df-49c0-a615-b43ef3305034, please check neutron logs for more information. [ 653.664442] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] [ 653.664781] env[59659]: INFO nova.compute.manager [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: dd588677-08d1-43d8-bff3-62b655a5a194] Terminating instance [ 653.666925] env[59659]: DEBUG oslo_concurrency.lockutils [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Acquiring lock "refresh_cache-dd588677-08d1-43d8-bff3-62b655a5a194" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 653.666925] env[59659]: DEBUG oslo_concurrency.lockutils [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Acquired lock "refresh_cache-dd588677-08d1-43d8-bff3-62b655a5a194" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 653.666925] env[59659]: DEBUG nova.network.neutron [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: dd588677-08d1-43d8-bff3-62b655a5a194] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 653.668569] env[59659]: DEBUG nova.network.neutron [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 653.764742] env[59659]: DEBUG nova.network.neutron [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: dd588677-08d1-43d8-bff3-62b655a5a194] Instance cache missing network info. 
{{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 653.776200] env[59659]: DEBUG nova.compute.manager [req-9693ccf5-3824-4c14-8e34-84354581ad2e req-8e8f7fe3-b5be-4f26-8b76-921246e62915 service nova] [instance: dd588677-08d1-43d8-bff3-62b655a5a194] Received event network-changed-d2091aef-34df-49c0-a615-b43ef3305034 {{(pid=59659) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 653.776450] env[59659]: DEBUG nova.compute.manager [req-9693ccf5-3824-4c14-8e34-84354581ad2e req-8e8f7fe3-b5be-4f26-8b76-921246e62915 service nova] [instance: dd588677-08d1-43d8-bff3-62b655a5a194] Refreshing instance network info cache due to event network-changed-d2091aef-34df-49c0-a615-b43ef3305034. {{(pid=59659) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 653.776637] env[59659]: DEBUG oslo_concurrency.lockutils [req-9693ccf5-3824-4c14-8e34-84354581ad2e req-8e8f7fe3-b5be-4f26-8b76-921246e62915 service nova] Acquiring lock "refresh_cache-dd588677-08d1-43d8-bff3-62b655a5a194" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 654.286107] env[59659]: DEBUG nova.network.neutron [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: dd588677-08d1-43d8-bff3-62b655a5a194] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 654.303681] env[59659]: DEBUG oslo_concurrency.lockutils [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Releasing lock "refresh_cache-dd588677-08d1-43d8-bff3-62b655a5a194" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 654.304153] env[59659]: DEBUG nova.compute.manager [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: dd588677-08d1-43d8-bff3-62b655a5a194] Start destroying the instance on the hypervisor. 
{{(pid=59659) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 654.304349] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: dd588677-08d1-43d8-bff3-62b655a5a194] Destroying instance {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 654.304750] env[59659]: DEBUG oslo_concurrency.lockutils [req-9693ccf5-3824-4c14-8e34-84354581ad2e req-8e8f7fe3-b5be-4f26-8b76-921246e62915 service nova] Acquired lock "refresh_cache-dd588677-08d1-43d8-bff3-62b655a5a194" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 654.304846] env[59659]: DEBUG nova.network.neutron [req-9693ccf5-3824-4c14-8e34-84354581ad2e req-8e8f7fe3-b5be-4f26-8b76-921246e62915 service nova] [instance: dd588677-08d1-43d8-bff3-62b655a5a194] Refreshing network info cache for port d2091aef-34df-49c0-a615-b43ef3305034 {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 654.306966] env[59659]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-6f50a3e0-bbbf-42bc-810d-c16a384ee5f9 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 654.322085] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7578fc31-7b99-42d1-b3d5-3e476b5e1639 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 654.346971] env[59659]: DEBUG nova.network.neutron [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 654.354064] env[59659]: WARNING nova.virt.vmwareapi.vmops [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: dd588677-08d1-43d8-bff3-62b655a5a194] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance dd588677-08d1-43d8-bff3-62b655a5a194 could not be found. [ 654.354064] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: dd588677-08d1-43d8-bff3-62b655a5a194] Instance destroyed {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 654.354184] env[59659]: INFO nova.compute.manager [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: dd588677-08d1-43d8-bff3-62b655a5a194] Took 0.05 seconds to destroy the instance on the hypervisor. [ 654.356041] env[59659]: DEBUG oslo.service.loopingcall [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=59659) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 654.356041] env[59659]: DEBUG nova.compute.manager [-] [instance: dd588677-08d1-43d8-bff3-62b655a5a194] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 654.356041] env[59659]: DEBUG nova.network.neutron [-] [instance: dd588677-08d1-43d8-bff3-62b655a5a194] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 654.372025] env[59659]: DEBUG oslo_concurrency.lockutils [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] Releasing lock "refresh_cache-10fc8044-6912-412f-9b84-50efb0e9a398" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 654.372025] env[59659]: DEBUG nova.compute.manager [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] Start destroying the instance on the hypervisor. {{(pid=59659) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 654.372025] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] Destroying instance {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 654.373019] env[59659]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-1d8f4e7b-ad3e-4760-9280-8fb5c1923416 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 654.393018] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-024ca683-4738-417e-8d30-368b0f314dd5 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 654.417604] env[59659]: WARNING nova.virt.vmwareapi.vmops [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 10fc8044-6912-412f-9b84-50efb0e9a398 could not be found. [ 654.417910] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] Instance destroyed {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 654.421020] env[59659]: INFO nova.compute.manager [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] Took 0.05 seconds to destroy the instance on the hypervisor. [ 654.421020] env[59659]: DEBUG oslo.service.loopingcall [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=59659) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 654.421020] env[59659]: DEBUG nova.compute.manager [-] [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 654.421020] env[59659]: DEBUG nova.network.neutron [-] [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 654.479180] env[59659]: DEBUG nova.network.neutron [-] [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 654.641636] env[59659]: DEBUG nova.network.neutron [req-9693ccf5-3824-4c14-8e34-84354581ad2e req-8e8f7fe3-b5be-4f26-8b76-921246e62915 service nova] [instance: dd588677-08d1-43d8-bff3-62b655a5a194] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 654.651777] env[59659]: DEBUG nova.network.neutron [-] [instance: dd588677-08d1-43d8-bff3-62b655a5a194] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 654.663654] env[59659]: DEBUG nova.network.neutron [-] [instance: dd588677-08d1-43d8-bff3-62b655a5a194] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 654.687798] env[59659]: INFO nova.compute.manager [-] [instance: dd588677-08d1-43d8-bff3-62b655a5a194] Took 0.33 seconds to deallocate network for instance. [ 654.692584] env[59659]: DEBUG nova.compute.claims [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: dd588677-08d1-43d8-bff3-62b655a5a194] Aborting claim: {{(pid=59659) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 654.692748] env[59659]: DEBUG oslo_concurrency.lockutils [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 654.692955] env[59659]: DEBUG oslo_concurrency.lockutils [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 654.797716] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-675a757f-8203-41d1-bd6c-d35a28c235f2 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 654.805641] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f6916ec-ec5b-4c8a-a446-9ad14489b199 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 654.839988] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-f7fd2462-118a-477a-9333-7057c6d49134 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 654.847443] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-63a1a4a2-1d3c-4543-a4a1-b666e945befd {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 654.860929] env[59659]: DEBUG nova.compute.provider_tree [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 654.869035] env[59659]: DEBUG nova.scheduler.client.report [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 177, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 654.887886] env[59659]: DEBUG oslo_concurrency.lockutils [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.195s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 654.888284] env[59659]: ERROR nova.compute.manager [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: dd588677-08d1-43d8-bff3-62b655a5a194] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port d2091aef-34df-49c0-a615-b43ef3305034, please check neutron logs for more information. 
[ 654.888284] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] Traceback (most recent call last): [ 654.888284] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 654.888284] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] self.driver.spawn(context, instance, image_meta, [ 654.888284] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 654.888284] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] self._vmops.spawn(context, instance, image_meta, injected_files, [ 654.888284] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 654.888284] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] vm_ref = self.build_virtual_machine(instance, [ 654.888284] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 654.888284] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] vif_infos = vmwarevif.get_vif_info(self._session, [ 654.888284] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 654.888594] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] for vif in network_info: [ 654.888594] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 654.888594] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] return self._sync_wrapper(fn, *args, **kwargs) [ 654.888594] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 654.888594] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] self.wait() [ 654.888594] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 654.888594] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] self[:] = self._gt.wait() [ 654.888594] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 654.888594] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] return self._exit_event.wait() [ 654.888594] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 654.888594] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] result = hub.switch() [ 654.888594] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 654.888594] env[59659]: ERROR 
nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] return self.greenlet.switch() [ 654.888594] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 654.889019] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] result = function(*args, **kwargs) [ 654.889019] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 654.889019] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] return func(*args, **kwargs) [ 654.889019] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 654.889019] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] raise e [ 654.889019] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 654.889019] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] nwinfo = self.network_api.allocate_for_instance( [ 654.889019] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 654.889019] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] created_port_ids = self._update_ports_for_instance( [ 654.889019] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 654.889019] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] with excutils.save_and_reraise_exception(): [ 654.889019] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 654.889019] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] self.force_reraise() [ 654.889383] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 654.889383] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] raise self.value [ 654.889383] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 654.889383] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] updated_port = self._update_port( [ 654.889383] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 654.889383] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] _ensure_no_port_binding_failure(port) [ 654.889383] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 654.889383] env[59659]: ERROR 
nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] raise exception.PortBindingFailed(port_id=port['id']) [ 654.889383] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] nova.exception.PortBindingFailed: Binding failed for port d2091aef-34df-49c0-a615-b43ef3305034, please check neutron logs for more information. [ 654.889383] env[59659]: ERROR nova.compute.manager [instance: dd588677-08d1-43d8-bff3-62b655a5a194] [ 654.889655] env[59659]: DEBUG nova.compute.utils [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: dd588677-08d1-43d8-bff3-62b655a5a194] Binding failed for port d2091aef-34df-49c0-a615-b43ef3305034, please check neutron logs for more information. {{(pid=59659) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 654.891160] env[59659]: DEBUG nova.compute.manager [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: dd588677-08d1-43d8-bff3-62b655a5a194] Build of instance dd588677-08d1-43d8-bff3-62b655a5a194 was re-scheduled: Binding failed for port d2091aef-34df-49c0-a615-b43ef3305034, please check neutron logs for more information. {{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 654.891160] env[59659]: DEBUG nova.compute.manager [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: dd588677-08d1-43d8-bff3-62b655a5a194] Unplugging VIFs for instance {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 654.891272] env[59659]: DEBUG oslo_concurrency.lockutils [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Acquiring lock "refresh_cache-dd588677-08d1-43d8-bff3-62b655a5a194" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 654.988397] env[59659]: DEBUG nova.network.neutron [req-9693ccf5-3824-4c14-8e34-84354581ad2e req-8e8f7fe3-b5be-4f26-8b76-921246e62915 service nova] [instance: dd588677-08d1-43d8-bff3-62b655a5a194] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 655.003108] env[59659]: DEBUG oslo_concurrency.lockutils [req-9693ccf5-3824-4c14-8e34-84354581ad2e req-8e8f7fe3-b5be-4f26-8b76-921246e62915 service nova] Releasing lock "refresh_cache-dd588677-08d1-43d8-bff3-62b655a5a194" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 655.003221] env[59659]: DEBUG oslo_concurrency.lockutils [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Acquired lock "refresh_cache-dd588677-08d1-43d8-bff3-62b655a5a194" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 655.003339] env[59659]: DEBUG nova.network.neutron [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: dd588677-08d1-43d8-bff3-62b655a5a194] Building network info cache for instance {{(pid=59659) 
_get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 655.081845] env[59659]: DEBUG nova.network.neutron [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: dd588677-08d1-43d8-bff3-62b655a5a194] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 655.454046] env[59659]: DEBUG nova.network.neutron [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: dd588677-08d1-43d8-bff3-62b655a5a194] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 655.467337] env[59659]: DEBUG oslo_concurrency.lockutils [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Releasing lock "refresh_cache-dd588677-08d1-43d8-bff3-62b655a5a194" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 655.467567] env[59659]: DEBUG nova.compute.manager [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 655.467937] env[59659]: DEBUG nova.compute.manager [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: dd588677-08d1-43d8-bff3-62b655a5a194] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 655.468471] env[59659]: DEBUG nova.network.neutron [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: dd588677-08d1-43d8-bff3-62b655a5a194] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 655.537172] env[59659]: DEBUG nova.network.neutron [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: dd588677-08d1-43d8-bff3-62b655a5a194] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 655.545527] env[59659]: DEBUG nova.network.neutron [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: dd588677-08d1-43d8-bff3-62b655a5a194] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 655.554755] env[59659]: INFO nova.compute.manager [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] [instance: dd588677-08d1-43d8-bff3-62b655a5a194] Took 0.09 seconds to deallocate network for instance. 
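The "Binding failed for port ..." errors above end in _ensure_no_port_binding_failure(port) raising nova.exception.PortBindingFailed, after which the build is aborted, the network deallocated, and the instance re-scheduled. The following is a minimal, self-contained sketch of that check, not Nova's actual code: the class and function are stand-ins, and the 'binding_failed' vif_type sentinel is an assumption rather than something taken from this log.

class PortBindingFailed(Exception):
    """Stand-in for nova.exception.PortBindingFailed (not Nova's class)."""
    def __init__(self, port_id):
        super().__init__(
            "Binding failed for port %s, please check neutron logs "
            "for more information." % port_id)
        self.port_id = port_id

def ensure_no_port_binding_failure(port):
    # Neutron records a failed binding on the port itself; the compute
    # service inspects the port dict returned by the Neutron API and turns
    # the failure into an exception so the build is aborted and re-scheduled.
    # The 'binding_failed' sentinel value is an assumption for illustration.
    if port.get('binding:vif_type') == 'binding_failed':
        raise PortBindingFailed(port_id=port['id'])

# Trimmed-down port dict shaped like a Neutron API response:
port = {'id': 'd2091aef-34df-49c0-a615-b43ef3305034',
        'binding:vif_type': 'binding_failed'}
try:
    ensure_no_port_binding_failure(port)
except PortBindingFailed as exc:
    print(exc)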
[ 655.667644] env[59659]: INFO nova.scheduler.client.report [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Deleted allocations for instance dd588677-08d1-43d8-bff3-62b655a5a194 [ 655.687320] env[59659]: DEBUG oslo_concurrency.lockutils [None req-056bcff0-52e4-4165-b6bc-eca6ea1b5ae6 tempest-DeleteServersAdminTestJSON-1437841386 tempest-DeleteServersAdminTestJSON-1437841386-project-member] Lock "dd588677-08d1-43d8-bff3-62b655a5a194" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 13.232s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 657.765334] env[59659]: DEBUG nova.compute.manager [req-86fac4cb-62b7-4886-8999-e52d7ea1516b req-55254cbb-e2ca-453f-9663-740075cef2e0 service nova] [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] Received event network-vif-deleted-af2b4893-ceaf-45d1-a1df-5e3041a748f9 {{(pid=59659) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 658.262759] env[59659]: DEBUG nova.network.neutron [-] [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 658.280435] env[59659]: INFO nova.compute.manager [-] [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] Took 3.86 seconds to deallocate network for instance. [ 658.284616] env[59659]: DEBUG nova.compute.claims [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] Aborting claim: {{(pid=59659) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 658.284944] env[59659]: DEBUG oslo_concurrency.lockutils [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 658.285548] env[59659]: DEBUG oslo_concurrency.lockutils [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 658.373666] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ba810f4-552d-454c-b079-5bfd47ee0562 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 658.383105] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a9775bf-22c2-4b9c-85b6-bf01843c4645 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 658.415298] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c56990dd-1817-4ac8-afc4-2525ae691381 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 658.423531] env[59659]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-159d19c2-0ad3-4d5e-aa3b-c051cb6b8d41 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 658.438521] env[59659]: DEBUG nova.compute.provider_tree [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 658.449132] env[59659]: DEBUG nova.scheduler.client.report [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 177, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 658.462915] env[59659]: DEBUG oslo_concurrency.lockutils [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.177s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 658.463551] env[59659]: ERROR nova.compute.manager [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port af2b4893-ceaf-45d1-a1df-5e3041a748f9, please check neutron logs for more information. 
[ 658.463551] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] Traceback (most recent call last): [ 658.463551] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 658.463551] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] self.driver.spawn(context, instance, image_meta, [ 658.463551] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 658.463551] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] self._vmops.spawn(context, instance, image_meta, injected_files, [ 658.463551] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 658.463551] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] vm_ref = self.build_virtual_machine(instance, [ 658.463551] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 658.463551] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] vif_infos = vmwarevif.get_vif_info(self._session, [ 658.463551] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 658.463877] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] for vif in network_info: [ 658.463877] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 658.463877] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] return self._sync_wrapper(fn, *args, **kwargs) [ 658.463877] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 658.463877] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] self.wait() [ 658.463877] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 658.463877] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] self[:] = self._gt.wait() [ 658.463877] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 658.463877] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] return self._exit_event.wait() [ 658.463877] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 658.463877] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] result = hub.switch() [ 658.463877] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 658.463877] env[59659]: ERROR 
nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] return self.greenlet.switch() [ 658.463877] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 658.464266] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] result = function(*args, **kwargs) [ 658.464266] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 658.464266] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] return func(*args, **kwargs) [ 658.464266] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 658.464266] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] raise e [ 658.464266] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 658.464266] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] nwinfo = self.network_api.allocate_for_instance( [ 658.464266] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 658.464266] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] created_port_ids = self._update_ports_for_instance( [ 658.464266] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 658.464266] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] with excutils.save_and_reraise_exception(): [ 658.464266] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 658.464266] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] self.force_reraise() [ 658.464608] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 658.464608] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] raise self.value [ 658.464608] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 658.464608] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] updated_port = self._update_port( [ 658.464608] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 658.464608] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] _ensure_no_port_binding_failure(port) [ 658.464608] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 658.464608] env[59659]: ERROR 
nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] raise exception.PortBindingFailed(port_id=port['id']) [ 658.464608] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] nova.exception.PortBindingFailed: Binding failed for port af2b4893-ceaf-45d1-a1df-5e3041a748f9, please check neutron logs for more information. [ 658.464608] env[59659]: ERROR nova.compute.manager [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] [ 658.464608] env[59659]: DEBUG nova.compute.utils [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] Binding failed for port af2b4893-ceaf-45d1-a1df-5e3041a748f9, please check neutron logs for more information. {{(pid=59659) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 658.466234] env[59659]: DEBUG nova.compute.manager [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] Build of instance 10fc8044-6912-412f-9b84-50efb0e9a398 was re-scheduled: Binding failed for port af2b4893-ceaf-45d1-a1df-5e3041a748f9, please check neutron logs for more information. {{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 658.466989] env[59659]: DEBUG nova.compute.manager [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] Unplugging VIFs for instance {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 658.467226] env[59659]: DEBUG oslo_concurrency.lockutils [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] Acquiring lock "refresh_cache-10fc8044-6912-412f-9b84-50efb0e9a398" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 658.467376] env[59659]: DEBUG oslo_concurrency.lockutils [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] Acquired lock "refresh_cache-10fc8044-6912-412f-9b84-50efb0e9a398" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 658.467533] env[59659]: DEBUG nova.network.neutron [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 658.583450] env[59659]: DEBUG nova.network.neutron [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] Instance cache missing network info. 
{{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 659.695903] env[59659]: DEBUG nova.network.neutron [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 659.708171] env[59659]: DEBUG oslo_concurrency.lockutils [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] Releasing lock "refresh_cache-10fc8044-6912-412f-9b84-50efb0e9a398" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 659.708394] env[59659]: DEBUG nova.compute.manager [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 659.710265] env[59659]: DEBUG nova.compute.manager [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 659.710265] env[59659]: DEBUG nova.network.neutron [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 660.021962] env[59659]: DEBUG nova.network.neutron [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 660.030718] env[59659]: DEBUG nova.network.neutron [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 660.044601] env[59659]: INFO nova.compute.manager [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] [instance: 10fc8044-6912-412f-9b84-50efb0e9a398] Took 0.34 seconds to deallocate network for instance. 
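The "Virt driver does not provide unplug_vifs method" entries above come from the re-schedule cleanup path: the compute manager attempts to unplug VIFs before deallocating the network and treats a driver without the hook as "unknown, skip". A rough sketch of that decision, using stand-in classes rather than Nova's real manager and driver:

class StubVMwareDriver:
    """Stand-in virt driver; like the in-tree VMware driver it leaves the
    optional unplug_vifs hook unimplemented."""
    def unplug_vifs(self, instance_uuid, network_info):
        raise NotImplementedError()

def cleanup_allocated_networks(driver, instance_uuid, network_info):
    # Try to unplug VIFs; a driver without the hook is simply logged and
    # skipped, which is what the debug message in the log reports.
    try:
        driver.unplug_vifs(instance_uuid, network_info)
    except NotImplementedError:
        print("Virt driver does not provide unplug_vifs method, so it is "
              "not possible to determine if VIFs should be unplugged.")
    # The Neutron resources are deallocated either way.
    print("Deallocating network for instance %s" % instance_uuid)

cleanup_allocated_networks(StubVMwareDriver(),
                           "10fc8044-6912-412f-9b84-50efb0e9a398", [])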
[ 660.172144] env[59659]: INFO nova.scheduler.client.report [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] Deleted allocations for instance 10fc8044-6912-412f-9b84-50efb0e9a398 [ 660.193297] env[59659]: DEBUG oslo_concurrency.lockutils [None req-afee1a71-dd09-4d66-8b96-5a4780ed3593 tempest-ServersTestMultiNic-532910345 tempest-ServersTestMultiNic-532910345-project-member] Lock "10fc8044-6912-412f-9b84-50efb0e9a398" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 21.796s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 694.399683] env[59659]: DEBUG oslo_service.periodic_task [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=59659) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 694.416603] env[59659]: DEBUG oslo_service.periodic_task [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59659) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 694.416603] env[59659]: DEBUG nova.compute.manager [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Starting heal instance info cache {{(pid=59659) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 694.416603] env[59659]: DEBUG nova.compute.manager [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Rebuilding the list of instances to heal {{(pid=59659) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 694.434073] env[59659]: DEBUG nova.compute.manager [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Didn't find any instances for network info cache update. 
{{(pid=59659) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 694.434073] env[59659]: DEBUG oslo_service.periodic_task [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59659) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 695.027725] env[59659]: DEBUG oslo_service.periodic_task [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59659) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 695.029779] env[59659]: DEBUG oslo_service.periodic_task [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59659) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 695.029779] env[59659]: DEBUG oslo_service.periodic_task [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59659) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 695.029779] env[59659]: DEBUG oslo_service.periodic_task [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59659) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 696.028064] env[59659]: DEBUG oslo_service.periodic_task [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59659) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 696.028064] env[59659]: DEBUG oslo_service.periodic_task [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59659) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 696.028064] env[59659]: DEBUG nova.compute.manager [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=59659) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 696.029679] env[59659]: DEBUG oslo_service.periodic_task [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59659) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 696.046018] env[59659]: DEBUG oslo_concurrency.lockutils [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 696.046018] env[59659]: DEBUG oslo_concurrency.lockutils [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 696.046018] env[59659]: DEBUG oslo_concurrency.lockutils [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 696.046018] env[59659]: DEBUG nova.compute.resource_tracker [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59659) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 696.047265] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a9f5ccb7-e61a-4abb-a55e-801a39cc8b1e {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 696.063044] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ac4b98a-5c57-482c-a778-358546373e34 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 696.084173] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85df0e87-8088-4235-9bdf-dbf7f778d8eb {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 696.092139] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-878aa786-efaf-437b-9cb1-a4cb4e884120 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 696.128267] env[59659]: DEBUG nova.compute.resource_tracker [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181437MB free_disk=177GB free_vcpus=48 pci_devices=None {{(pid=59659) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 696.128582] env[59659]: DEBUG oslo_concurrency.lockutils [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 696.128903] env[59659]: DEBUG oslo_concurrency.lockutils [None req-d68d156f-4335-409d-8cbf-671e31ef1543 
None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 696.195832] env[59659]: DEBUG nova.compute.resource_tracker [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=59659) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 696.196019] env[59659]: DEBUG nova.compute.resource_tracker [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=59659) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 696.220091] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dbc2cc64-eec0-45f2-9471-8e5c863e9fb4 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 696.226218] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1641d525-08f4-42c7-9e43-e753e1cea126 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 696.258116] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c335d9e-f28f-4205-aade-ac18e72ec2e1 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 696.265574] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a6f0e000-ff14-4dfd-af1a-91958c17ac03 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 696.285036] env[59659]: DEBUG nova.compute.provider_tree [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 696.308707] env[59659]: DEBUG nova.scheduler.client.report [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 177, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 696.341489] env[59659]: DEBUG nova.compute.resource_tracker [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59659) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 696.341669] env[59659]: DEBUG oslo_concurrency.lockutils [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.213s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 
710.497393] env[59659]: DEBUG oslo_concurrency.lockutils [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Acquiring lock "ea968312-62ea-4f55-87e9-f91823fc14c2" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 710.497720] env[59659]: DEBUG oslo_concurrency.lockutils [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Lock "ea968312-62ea-4f55-87e9-f91823fc14c2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 710.513597] env[59659]: DEBUG nova.compute.manager [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] Starting instance... {{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 710.573873] env[59659]: DEBUG oslo_concurrency.lockutils [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 710.574141] env[59659]: DEBUG oslo_concurrency.lockutils [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 710.576289] env[59659]: INFO nova.compute.claims [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 710.680022] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27d3d76f-45d7-4e58-9614-897ac983210d {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 710.688221] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc1e875a-a7df-4b86-af04-943876e67ca6 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 710.723896] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f34101cb-fb24-4adb-a4cb-29ef424881b4 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 710.731491] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01dc2dbb-15b2-48a2-ba89-e81095829aba {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 710.745769] env[59659]: DEBUG nova.compute.provider_tree [None req-20524171-0491-451c-acd6-bf4548e093fb 
tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Updating inventory in ProviderTree for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 710.770995] env[59659]: ERROR nova.scheduler.client.report [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] [req-c336b598-f35d-4eb7-ab22-96387170c79c] Failed to update inventory to [{'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}}] for resource provider with UUID 69a84459-8a9e-4a6c-afd9-ec42e61132ce. Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict ", "code": "placement.concurrent_update", "request_id": "req-c336b598-f35d-4eb7-ab22-96387170c79c"}]} [ 710.786590] env[59659]: DEBUG nova.scheduler.client.report [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Refreshing inventories for resource provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 710.804895] env[59659]: DEBUG nova.scheduler.client.report [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Updating ProviderTree inventory for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 177, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 710.805152] env[59659]: DEBUG nova.compute.provider_tree [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Updating inventory in ProviderTree for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 177, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 710.819747] env[59659]: DEBUG nova.scheduler.client.report [None 
req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Refreshing aggregate associations for resource provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce, aggregates: None {{(pid=59659) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 710.842494] env[59659]: DEBUG nova.scheduler.client.report [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Refreshing trait associations for resource provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce, traits: COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SAME_HOST_COLD_MIGRATE {{(pid=59659) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 710.871642] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-108f5c38-ec65-4a16-90d8-04d19ab32523 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 710.879464] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6aac4253-9786-44b4-a4b4-9c7d1ebeb3ce {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 710.911574] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24e4b1dd-7025-4106-a47e-4e58463c63c5 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 710.919073] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b6a8649f-4050-43db-ae76-e9db0141b614 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 710.932604] env[59659]: DEBUG nova.compute.provider_tree [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Updating inventory in ProviderTree for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 710.973014] env[59659]: DEBUG nova.scheduler.client.report [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Updated inventory for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce with generation 17 in Placement from set_inventory_for_provider using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}} [ 710.973014] env[59659]: DEBUG nova.compute.provider_tree [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 
tempest-ServerShowV257Test-1565272682-project-member] Updating resource provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce generation from 17 to 18 during operation: update_inventory {{(pid=59659) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 710.973134] env[59659]: DEBUG nova.compute.provider_tree [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Updating inventory in ProviderTree for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 710.989325] env[59659]: DEBUG oslo_concurrency.lockutils [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.415s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 710.989325] env[59659]: DEBUG nova.compute.manager [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] Start building networks asynchronously for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 711.030055] env[59659]: DEBUG nova.compute.utils [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Using /dev/sd instead of None {{(pid=59659) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 711.031980] env[59659]: DEBUG nova.compute.manager [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] Not allocating networking since 'none' was specified. {{(pid=59659) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1948}} [ 711.047846] env[59659]: DEBUG nova.compute.manager [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] Start building block device mappings for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 711.132645] env[59659]: DEBUG nova.compute.manager [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] Start spawning the instance on the hypervisor. 
{{(pid=59659) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 711.164740] env[59659]: DEBUG nova.virt.hardware [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-30T19:41:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-30T19:41:37Z,direct_url=,disk_format='vmdk',id=0fa786c9-f55e-46dc-b725-aa456ca9ff53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='3f4ccdf020b5413c9a4233eccb6b55a8',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-30T19:41:37Z,virtual_size=,visibility=), allow threads: False {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 711.164978] env[59659]: DEBUG nova.virt.hardware [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Flavor limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 711.165146] env[59659]: DEBUG nova.virt.hardware [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Image limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 711.165326] env[59659]: DEBUG nova.virt.hardware [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Flavor pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 711.165599] env[59659]: DEBUG nova.virt.hardware [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Image pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 711.165599] env[59659]: DEBUG nova.virt.hardware [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 711.165898] env[59659]: DEBUG nova.virt.hardware [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 711.166819] env[59659]: DEBUG nova.virt.hardware [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 711.167440] env[59659]: DEBUG nova.virt.hardware [None req-20524171-0491-451c-acd6-bf4548e093fb 
tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Got 1 possible topologies {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 711.167509] env[59659]: DEBUG nova.virt.hardware [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 711.167805] env[59659]: DEBUG nova.virt.hardware [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 711.168592] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9bd57d8-9151-450c-92c1-7dd150dffeaa {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 711.180342] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8181dd66-51cb-405c-a064-6afad5fc2f42 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 711.195044] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] Instance VIF info [] {{(pid=59659) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 711.206233] env[59659]: DEBUG nova.virt.vmwareapi.vm_util [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Creating folder: OpenStack. Parent ref: group-v4. {{(pid=59659) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 711.207106] env[59659]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7d7c83ef-fba7-419c-b3c5-64d63ef9157e {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 711.221411] env[59659]: WARNING suds.client [-] Web service reported a SOAP processing fault using an unexpected HTTP status code 200. Reporting as an internal server error. [ 711.221596] env[59659]: DEBUG oslo_vmware.api [-] Fault list: [DuplicateName] {{(pid=59659) _invoke_api /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:337}} [ 711.221958] env[59659]: DEBUG nova.virt.vmwareapi.vm_util [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Folder already exists: OpenStack. Parent ref: group-v4. {{(pid=59659) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1599}} [ 711.222076] env[59659]: DEBUG nova.virt.vmwareapi.vm_util [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Creating folder: Project (834aaa625fd84a18b32ddef466b431a4). Parent ref: group-v293946. 
{{(pid=59659) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 711.222313] env[59659]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-716a91cb-1c67-413c-9f9e-225fb3103d32 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 711.233319] env[59659]: INFO nova.virt.vmwareapi.vm_util [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Created folder: Project (834aaa625fd84a18b32ddef466b431a4) in parent group-v293946. [ 711.233520] env[59659]: DEBUG nova.virt.vmwareapi.vm_util [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Creating folder: Instances. Parent ref: group-v293951. {{(pid=59659) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 711.233817] env[59659]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-8ce04cdb-e1a5-4374-aa38-2df83a18e763 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 711.244444] env[59659]: INFO nova.virt.vmwareapi.vm_util [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Created folder: Instances in parent group-v293951. [ 711.244753] env[59659]: DEBUG oslo.service.loopingcall [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59659) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 711.244957] env[59659]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] Creating VM on the ESX host {{(pid=59659) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 711.245177] env[59659]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-3db9c589-afa8-4cd3-9308-0bedefeacefc {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 711.265513] env[59659]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 711.265513] env[59659]: value = "task-1384537" [ 711.265513] env[59659]: _type = "Task" [ 711.265513] env[59659]: } to complete. {{(pid=59659) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 711.275632] env[59659]: DEBUG oslo_vmware.api [-] Task: {'id': task-1384537, 'name': CreateVM_Task} progress is 0%. {{(pid=59659) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 711.783216] env[59659]: DEBUG oslo_vmware.api [-] Task: {'id': task-1384537, 'name': CreateVM_Task, 'duration_secs': 0.283246} completed successfully. 
{{(pid=59659) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 711.784346] env[59659]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] Created VM on the ESX host {{(pid=59659) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 711.784497] env[59659]: DEBUG oslo_vmware.service [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-804aded1-fb23-4f6f-b9ea-62ae6214c884 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 711.791570] env[59659]: DEBUG oslo_concurrency.lockutils [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/0fa786c9-f55e-46dc-b725-aa456ca9ff53" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 711.791738] env[59659]: DEBUG oslo_concurrency.lockutils [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Acquired lock "[datastore2] devstack-image-cache_base/0fa786c9-f55e-46dc-b725-aa456ca9ff53" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 711.792518] env[59659]: DEBUG oslo_concurrency.lockutils [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/0fa786c9-f55e-46dc-b725-aa456ca9ff53" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 711.792704] env[59659]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5ab634d2-c65a-4c12-8629-fa31a7a85253 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 711.798660] env[59659]: DEBUG oslo_vmware.api [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Waiting for the task: (returnval){ [ 711.798660] env[59659]: value = "session[522b4e95-5a6c-8a31-9bee-9871c53e7a49]52fe2457-ebf9-74f1-9e28-b8afda386cc3" [ 711.798660] env[59659]: _type = "Task" [ 711.798660] env[59659]: } to complete. {{(pid=59659) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 711.809259] env[59659]: DEBUG oslo_vmware.api [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Task: {'id': session[522b4e95-5a6c-8a31-9bee-9871c53e7a49]52fe2457-ebf9-74f1-9e28-b8afda386cc3, 'name': SearchDatastore_Task} progress is 0%. 
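The Acquiring/Acquired records around "[datastore2] devstack-image-cache_base/0fa786c9-..." show the image cache being serialized per image id (plus an external semaphore) before the datastore is searched. A small sketch of that serialization with oslo.concurrency; fetch_fn is a hypothetical callable standing in for the cache-population step:

from oslo_concurrency import lockutils


def ensure_cached_image(image_id, datastore, fetch_fn):
    """Run the cache check/population for one image id at a time."""
    lock_name = f'[{datastore}] devstack-image-cache_base/{image_id}'
    with lockutils.lock(lock_name):
        # Inside the lock: look for the cached VMDK and fetch it only on a
        # miss (the "Processing image ..." path later in the log).
        return fetch_fn(image_id, datastore)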
{{(pid=59659) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 711.838808] env[59659]: DEBUG oslo_concurrency.lockutils [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] Acquiring lock "9759f284-26e2-466e-9504-ffb63a359f27" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 711.839041] env[59659]: DEBUG oslo_concurrency.lockutils [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] Lock "9759f284-26e2-466e-9504-ffb63a359f27" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 711.852584] env[59659]: DEBUG nova.compute.manager [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] [instance: 9759f284-26e2-466e-9504-ffb63a359f27] Starting instance... {{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 711.916503] env[59659]: DEBUG oslo_concurrency.lockutils [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 711.916503] env[59659]: DEBUG oslo_concurrency.lockutils [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 711.916503] env[59659]: INFO nova.compute.claims [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] [instance: 9759f284-26e2-466e-9504-ffb63a359f27] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 712.045286] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12354226-2c48-4011-bf57-7a698886a672 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 712.053198] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7df7dfda-b6d8-47d7-9716-fc4db3fc559d {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 712.092245] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cbcf5023-0c2a-4cfc-a850-045c3c84176a {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 712.101264] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09205ab7-97a4-490f-9a3c-42ea8c379ed4 {{(pid=59659) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 712.123724] env[59659]: DEBUG nova.compute.provider_tree [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 712.143172] env[59659]: DEBUG nova.scheduler.client.report [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 712.160480] env[59659]: DEBUG oslo_concurrency.lockutils [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.246s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 712.163700] env[59659]: DEBUG nova.compute.manager [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] [instance: 9759f284-26e2-466e-9504-ffb63a359f27] Start building networks asynchronously for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 712.207632] env[59659]: DEBUG nova.compute.utils [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] Using /dev/sd instead of None {{(pid=59659) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 712.207632] env[59659]: DEBUG nova.compute.manager [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] [instance: 9759f284-26e2-466e-9504-ffb63a359f27] Allocating IP information in the background. {{(pid=59659) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 712.209214] env[59659]: DEBUG nova.network.neutron [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] [instance: 9759f284-26e2-466e-9504-ffb63a359f27] allocate_for_instance() {{(pid=59659) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 712.219723] env[59659]: DEBUG nova.compute.manager [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] [instance: 9759f284-26e2-466e-9504-ffb63a359f27] Start building block device mappings for instance. 
{{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 712.318138] env[59659]: DEBUG oslo_concurrency.lockutils [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Releasing lock "[datastore2] devstack-image-cache_base/0fa786c9-f55e-46dc-b725-aa456ca9ff53" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 712.318138] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] Processing image 0fa786c9-f55e-46dc-b725-aa456ca9ff53 {{(pid=59659) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 712.318138] env[59659]: DEBUG oslo_concurrency.lockutils [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/0fa786c9-f55e-46dc-b725-aa456ca9ff53/0fa786c9-f55e-46dc-b725-aa456ca9ff53.vmdk" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 712.318138] env[59659]: DEBUG oslo_concurrency.lockutils [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Acquired lock "[datastore2] devstack-image-cache_base/0fa786c9-f55e-46dc-b725-aa456ca9ff53/0fa786c9-f55e-46dc-b725-aa456ca9ff53.vmdk" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 712.318284] env[59659]: DEBUG nova.virt.vmwareapi.ds_util [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=59659) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 712.318284] env[59659]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-60d61f36-50d8-4b73-abb5-95f85c5a5346 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 712.331958] env[59659]: DEBUG nova.compute.manager [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] [instance: 9759f284-26e2-466e-9504-ffb63a359f27] Start spawning the instance on the hypervisor. {{(pid=59659) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 712.337713] env[59659]: DEBUG nova.virt.vmwareapi.ds_util [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=59659) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 712.337976] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=59659) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 712.338678] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a292f29e-1c59-4a7a-934b-76bb85a6f48f {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 712.349927] env[59659]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-178ea975-4489-4c27-bf1c-2823a9b140fb {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 712.359443] env[59659]: DEBUG oslo_vmware.api [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Waiting for the task: (returnval){ [ 712.359443] env[59659]: value = "session[522b4e95-5a6c-8a31-9bee-9871c53e7a49]528153b9-d862-fd39-1ed9-281668110add" [ 712.359443] env[59659]: _type = "Task" [ 712.359443] env[59659]: } to complete. {{(pid=59659) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 712.368505] env[59659]: DEBUG nova.virt.hardware [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-30T19:41:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-30T19:41:37Z,direct_url=,disk_format='vmdk',id=0fa786c9-f55e-46dc-b725-aa456ca9ff53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='3f4ccdf020b5413c9a4233eccb6b55a8',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-30T19:41:37Z,virtual_size=,visibility=), allow threads: False {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 712.368727] env[59659]: DEBUG nova.virt.hardware [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] Flavor limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 712.368872] env[59659]: DEBUG nova.virt.hardware [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] Image limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 712.369081] env[59659]: DEBUG nova.virt.hardware [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] Flavor pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 712.369197] env[59659]: DEBUG nova.virt.hardware [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] Image pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 712.369331] env[59659]: DEBUG nova.virt.hardware [None 
req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 712.369525] env[59659]: DEBUG nova.virt.hardware [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 712.369669] env[59659]: DEBUG nova.virt.hardware [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 712.369821] env[59659]: DEBUG nova.virt.hardware [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] Got 1 possible topologies {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 712.369969] env[59659]: DEBUG nova.virt.hardware [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 712.370138] env[59659]: DEBUG nova.virt.hardware [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 712.370976] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cebc7a0a-c588-42dc-b2d6-12b47711eb90 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 712.377143] env[59659]: DEBUG oslo_vmware.api [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Task: {'id': session[522b4e95-5a6c-8a31-9bee-9871c53e7a49]528153b9-d862-fd39-1ed9-281668110add, 'name': SearchDatastore_Task} progress is 0%. 
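The nova.virt.hardware records repeat the same decision for every build: flavor and image impose no topology limits (0:0:0, maximum 65536 each), so a 1-vCPU flavor yields exactly one possible topology, 1 socket x 1 core x 1 thread. A simplified sketch of that enumeration, not the real _get_possible_cpu_topologies:

import itertools
from collections import namedtuple

VirtCPUTopology = namedtuple('VirtCPUTopology', 'sockets cores threads')


def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                        max_threads=65536):
    """Enumerate socket/core/thread splits whose product equals vcpus."""
    found = []
    for s, c, t in itertools.product(range(1, min(vcpus, max_sockets) + 1),
                                     range(1, min(vcpus, max_cores) + 1),
                                     range(1, min(vcpus, max_threads) + 1)):
        if s * c * t == vcpus:
            found.append(VirtCPUTopology(s, c, t))
    return found


# For the m1.nano flavor above (vcpus=1) this yields the single topology
# [VirtCPUTopology(sockets=1, cores=1, threads=1)], matching the log.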
{{(pid=59659) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 712.382112] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f7dc5e4c-2332-4b83-a275-969d3ee15984 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 712.458102] env[59659]: DEBUG oslo_concurrency.lockutils [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] Acquiring lock "fdd34513-15af-4294-8a8a-e3b095188eda" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 712.458369] env[59659]: DEBUG oslo_concurrency.lockutils [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] Lock "fdd34513-15af-4294-8a8a-e3b095188eda" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 712.477591] env[59659]: DEBUG nova.compute.manager [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] [instance: fdd34513-15af-4294-8a8a-e3b095188eda] Starting instance... {{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 712.534238] env[59659]: DEBUG oslo_concurrency.lockutils [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 712.534238] env[59659]: DEBUG oslo_concurrency.lockutils [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 712.535287] env[59659]: INFO nova.compute.claims [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] [instance: fdd34513-15af-4294-8a8a-e3b095188eda] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 712.612691] env[59659]: DEBUG nova.policy [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '63ed5c849ba14360bfc1356ce027a979', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e235144751dc4f10be76b379d3e6e53d', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59659) authorize /opt/stack/nova/nova/policy.py:203}} [ 712.659941] env[59659]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d3bc595-6244-495c-a704-e5b8bb426256 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 712.669607] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fac7e598-8d5a-4875-aea8-55ddbf67bea2 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 712.701835] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cee67b67-c481-4dd7-9053-5cf4956fb3c7 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 712.709340] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18c54dde-1f94-46d8-9554-2be6aec688f1 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 712.725235] env[59659]: DEBUG nova.compute.provider_tree [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 712.741679] env[59659]: DEBUG nova.scheduler.client.report [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 712.766752] env[59659]: DEBUG oslo_concurrency.lockutils [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.233s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 712.767337] env[59659]: DEBUG nova.compute.manager [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] [instance: fdd34513-15af-4294-8a8a-e3b095188eda] Start building networks asynchronously for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 712.809359] env[59659]: DEBUG nova.compute.utils [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] Using /dev/sd instead of None {{(pid=59659) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 712.811582] env[59659]: DEBUG nova.compute.manager [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] [instance: fdd34513-15af-4294-8a8a-e3b095188eda] Allocating IP information in the background. 
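"Allocating IP information in the background" corresponds to the compute manager starting Neutron port allocation concurrently while block device mappings and the VM spec are prepared. A rough sketch of that overlap using a plain thread pool rather than Nova's own async helper; the three callables are hypothetical stand-ins for the steps named in the records:

from concurrent.futures import ThreadPoolExecutor

_executor = ThreadPoolExecutor(max_workers=4)


def build_instance(instance, allocate_network_fn, build_bdms_fn, spawn_fn):
    """Overlap network allocation with the rest of the build."""
    network_future = _executor.submit(allocate_network_fn, instance)
    block_devices = build_bdms_fn(instance)   # runs while Neutron works
    network_info = network_future.result()    # join before spawning
    return spawn_fn(instance, network_info, block_devices)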
{{(pid=59659) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 712.811582] env[59659]: DEBUG nova.network.neutron [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] [instance: fdd34513-15af-4294-8a8a-e3b095188eda] allocate_for_instance() {{(pid=59659) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 712.821320] env[59659]: DEBUG nova.compute.manager [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] [instance: fdd34513-15af-4294-8a8a-e3b095188eda] Start building block device mappings for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 712.874375] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] Preparing fetch location {{(pid=59659) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 712.874663] env[59659]: DEBUG nova.virt.vmwareapi.ds_util [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Creating directory with path [datastore2] vmware_temp/502a1631-b8ed-4837-9429-7e90b81eeee6/0fa786c9-f55e-46dc-b725-aa456ca9ff53 {{(pid=59659) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 712.874978] env[59659]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-cbe91aff-abb9-42dd-850c-2d58c6ca7b4b {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 712.896185] env[59659]: DEBUG nova.compute.manager [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] [instance: fdd34513-15af-4294-8a8a-e3b095188eda] Start spawning the instance on the hypervisor. 
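"Preparing fetch location" and the vmware_temp/<uuid>/<image id> directory creation show the cache-miss branch: the image is downloaded into a temporary per-request directory before being installed into the cache. A sketch of that control flow under assumed helper names (download_fn and move_fn are hypothetical, and datastore paths are simplified to plain paths):

import os
import uuid


def fetch_image_if_missing(image_id, cache_dir, download_fn, move_fn,
                           exists_fn=os.path.exists):
    """Download image_id into cache_dir only when it is not already there."""
    cached = os.path.join(cache_dir, image_id, f'{image_id}.vmdk')
    if exists_fn(cached):
        return cached                        # cache hit: nothing to fetch
    tmp_dir = os.path.join('vmware_temp', str(uuid.uuid4()), image_id)
    tmp_path = os.path.join(tmp_dir, 'tmp-sparse.vmdk')
    download_fn(image_id, tmp_path)          # stream the image data down
    move_fn(tmp_path, cached)                # publish it into the cache
    return cached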
{{(pid=59659) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 712.907072] env[59659]: DEBUG nova.virt.vmwareapi.ds_util [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Created directory with path [datastore2] vmware_temp/502a1631-b8ed-4837-9429-7e90b81eeee6/0fa786c9-f55e-46dc-b725-aa456ca9ff53 {{(pid=59659) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 712.907278] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] Fetch image to [datastore2] vmware_temp/502a1631-b8ed-4837-9429-7e90b81eeee6/0fa786c9-f55e-46dc-b725-aa456ca9ff53/tmp-sparse.vmdk {{(pid=59659) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 712.907454] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] Downloading image file data 0fa786c9-f55e-46dc-b725-aa456ca9ff53 to [datastore2] vmware_temp/502a1631-b8ed-4837-9429-7e90b81eeee6/0fa786c9-f55e-46dc-b725-aa456ca9ff53/tmp-sparse.vmdk on the data store datastore2 {{(pid=59659) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 712.908254] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28519df4-b877-42a2-bcfa-bb7762d49601 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 712.920345] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-47061629-2e5c-4232-a0d2-299f5900bf8b {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 712.925214] env[59659]: DEBUG nova.virt.hardware [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-30T19:41:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-30T19:41:37Z,direct_url=,disk_format='vmdk',id=0fa786c9-f55e-46dc-b725-aa456ca9ff53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='3f4ccdf020b5413c9a4233eccb6b55a8',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-30T19:41:37Z,virtual_size=,visibility=), allow threads: False {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 712.925436] env[59659]: DEBUG nova.virt.hardware [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] Flavor limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 712.925584] env[59659]: DEBUG nova.virt.hardware [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 
tempest-SecurityGroupsTestJSON-536329627-project-member] Image limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 712.925762] env[59659]: DEBUG nova.virt.hardware [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] Flavor pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 712.925899] env[59659]: DEBUG nova.virt.hardware [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] Image pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 712.926045] env[59659]: DEBUG nova.virt.hardware [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 712.926243] env[59659]: DEBUG nova.virt.hardware [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 712.926392] env[59659]: DEBUG nova.virt.hardware [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 712.926545] env[59659]: DEBUG nova.virt.hardware [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] Got 1 possible topologies {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 712.926699] env[59659]: DEBUG nova.virt.hardware [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 712.926866] env[59659]: DEBUG nova.virt.hardware [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 712.927683] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9edb5295-1bbe-4dda-8d1d-a1a2e82375ce {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 712.941125] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f08cbd0-aa18-4fed-8db5-b3972d12cb4e {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 712.947019] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-fb673c6d-c3ae-469c-aa01-cd734955e50b {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 712.992785] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f6c9eeb-661c-43a4-b4d3-28389ee398bc {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 713.000145] env[59659]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-885497a4-963d-4ee2-8b16-3bda4f668f6c {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 713.050938] env[59659]: DEBUG nova.policy [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e9d0f0c1831a4246b24095e2592c8cfc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '580e685106a749aca0f9769c5c269798', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59659) authorize /opt/stack/nova/nova/policy.py:203}} [ 713.088357] env[59659]: DEBUG nova.virt.vmwareapi.images [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] Downloading image file data 0fa786c9-f55e-46dc-b725-aa456ca9ff53 to the data store datastore2 {{(pid=59659) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 713.139249] env[59659]: DEBUG oslo_vmware.rw_handles [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/502a1631-b8ed-4837-9429-7e90b81eeee6/0fa786c9-f55e-46dc-b725-aa456ca9ff53/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=59659) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 713.207347] env[59659]: DEBUG oslo_vmware.rw_handles [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Completed reading data from the image iterator. {{(pid=59659) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 713.207552] env[59659]: DEBUG oslo_vmware.rw_handles [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/502a1631-b8ed-4837-9429-7e90b81eeee6/0fa786c9-f55e-46dc-b725-aa456ca9ff53/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
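The rw_handles records show the 21318656-byte image being streamed over HTTPS to the datastore's /folder URL (after SessionManager.AcquireGenericServiceTicket) and the write handle being closed. A rough, simplified equivalent using requests; the PUT method and cookie-style auth header are assumptions, and the real oslo.vmware handle manages the connection at a lower level:

import requests


def upload_to_datastore(url, image_file, cookie, verify=True):
    """Stream an image file to a vSphere datastore /folder URL."""
    # image_file is any binary file-like object; requests streams it rather
    # than buffering the whole image in memory.
    resp = requests.put(url, data=image_file,
                        headers={'Cookie': cookie}, verify=verify)
    resp.raise_for_status()
    return resp.status_code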
{{(pid=59659) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 713.628779] env[59659]: DEBUG oslo_concurrency.lockutils [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] Acquiring lock "62ade33c-5283-432d-872c-cc162254317d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 713.629021] env[59659]: DEBUG oslo_concurrency.lockutils [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] Lock "62ade33c-5283-432d-872c-cc162254317d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 713.652876] env[59659]: DEBUG nova.compute.manager [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] [instance: 62ade33c-5283-432d-872c-cc162254317d] Starting instance... {{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 713.706792] env[59659]: DEBUG oslo_concurrency.lockutils [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 713.707104] env[59659]: DEBUG oslo_concurrency.lockutils [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 713.708738] env[59659]: INFO nova.compute.claims [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] [instance: 62ade33c-5283-432d-872c-cc162254317d] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 713.823047] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de416359-dd65-4201-9f2c-5221606aa01e {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 713.832222] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-083a9fed-f0dd-4f28-926d-95d12c581120 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 713.872678] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-49ab9af2-0064-4e01-bc2a-6e32351c8afd {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 713.881258] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec9aa065-a3bf-4425-9178-b37762d95d8a {{(pid=59659) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 713.896927] env[59659]: DEBUG nova.compute.provider_tree [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 713.905433] env[59659]: DEBUG nova.scheduler.client.report [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 713.918991] env[59659]: DEBUG oslo_concurrency.lockutils [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.212s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 713.919483] env[59659]: DEBUG nova.compute.manager [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] [instance: 62ade33c-5283-432d-872c-cc162254317d] Start building networks asynchronously for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 713.955380] env[59659]: DEBUG nova.compute.utils [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] Using /dev/sd instead of None {{(pid=59659) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 713.957084] env[59659]: DEBUG nova.compute.manager [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] [instance: 62ade33c-5283-432d-872c-cc162254317d] Allocating IP information in the background. {{(pid=59659) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 713.957266] env[59659]: DEBUG nova.network.neutron [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] [instance: 62ade33c-5283-432d-872c-cc162254317d] allocate_for_instance() {{(pid=59659) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 713.968350] env[59659]: DEBUG nova.compute.manager [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] [instance: 62ade33c-5283-432d-872c-cc162254317d] Start building block device mappings for instance. 
{{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 714.039954] env[59659]: DEBUG nova.compute.manager [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] [instance: 62ade33c-5283-432d-872c-cc162254317d] Start spawning the instance on the hypervisor. {{(pid=59659) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 714.067328] env[59659]: DEBUG nova.virt.hardware [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-30T19:41:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-30T19:41:37Z,direct_url=,disk_format='vmdk',id=0fa786c9-f55e-46dc-b725-aa456ca9ff53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='3f4ccdf020b5413c9a4233eccb6b55a8',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-30T19:41:37Z,virtual_size=,visibility=), allow threads: False {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 714.067467] env[59659]: DEBUG nova.virt.hardware [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] Flavor limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 714.067604] env[59659]: DEBUG nova.virt.hardware [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] Image limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 714.067790] env[59659]: DEBUG nova.virt.hardware [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] Flavor pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 714.067954] env[59659]: DEBUG nova.virt.hardware [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] Image pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 714.068431] env[59659]: DEBUG nova.virt.hardware [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 714.068681] env[59659]: DEBUG nova.virt.hardware [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59659) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:569}} [ 714.068845] env[59659]: DEBUG nova.virt.hardware [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 714.069033] env[59659]: DEBUG nova.virt.hardware [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] Got 1 possible topologies {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 714.069202] env[59659]: DEBUG nova.virt.hardware [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 714.069399] env[59659]: DEBUG nova.virt.hardware [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 714.070297] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29d15fcc-f596-42d1-8e43-470ac017b13c {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 714.079184] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f0f8470d-dc58-42c6-8b10-7037753cc2a5 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 714.219828] env[59659]: DEBUG nova.policy [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4f3796dbcf6c46a4be35270ec92c09da', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '20659337b9974b26a9f4123181bd14ce', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59659) authorize /opt/stack/nova/nova/policy.py:203}} [ 714.557251] env[59659]: DEBUG nova.network.neutron [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] [instance: 9759f284-26e2-466e-9504-ffb63a359f27] Successfully created port: f96d3967-9568-40ee-9bd9-a29f08464a46 {{(pid=59659) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 714.904404] env[59659]: DEBUG oslo_concurrency.lockutils [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] Acquiring lock "ce3bd633-4538-428d-9258-9222c3c72edd" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 714.904668] 
env[59659]: DEBUG oslo_concurrency.lockutils [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] Lock "ce3bd633-4538-428d-9258-9222c3c72edd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 714.918018] env[59659]: DEBUG nova.compute.manager [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] [instance: ce3bd633-4538-428d-9258-9222c3c72edd] Starting instance... {{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 714.985078] env[59659]: DEBUG oslo_concurrency.lockutils [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 714.985353] env[59659]: DEBUG oslo_concurrency.lockutils [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 714.987047] env[59659]: INFO nova.compute.claims [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] [instance: ce3bd633-4538-428d-9258-9222c3c72edd] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 715.135826] env[59659]: DEBUG nova.network.neutron [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] [instance: fdd34513-15af-4294-8a8a-e3b095188eda] Successfully created port: 711f0237-f381-4494-9f23-9a9c2e51d498 {{(pid=59659) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 715.148236] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-387176c8-4174-4dd5-a145-2429e9860bb8 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 715.156187] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-259dceeb-2c55-426b-a4ef-5a442435f2ec {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 715.192521] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b93bc482-773e-4d21-a79e-dc67fa998c38 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 715.200261] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ff45a1f-661f-4042-a96e-6938b4bfe6e1 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 715.215895] env[59659]: DEBUG nova.compute.provider_tree [None 
req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 715.225750] env[59659]: DEBUG nova.scheduler.client.report [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 715.238024] env[59659]: DEBUG oslo_concurrency.lockutils [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.252s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 715.240102] env[59659]: DEBUG nova.compute.manager [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] [instance: ce3bd633-4538-428d-9258-9222c3c72edd] Start building networks asynchronously for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 715.288196] env[59659]: DEBUG nova.compute.utils [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] Using /dev/sd instead of None {{(pid=59659) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 715.289149] env[59659]: DEBUG nova.compute.manager [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] [instance: ce3bd633-4538-428d-9258-9222c3c72edd] Allocating IP information in the background. {{(pid=59659) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 715.289371] env[59659]: DEBUG nova.network.neutron [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] [instance: ce3bd633-4538-428d-9258-9222c3c72edd] allocate_for_instance() {{(pid=59659) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 715.301483] env[59659]: DEBUG nova.compute.manager [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] [instance: ce3bd633-4538-428d-9258-9222c3c72edd] Start building block device mappings for instance. 
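"Inventory has not changed for provider ..." shows the report client comparing the locally computed inventory dict against what Placement already holds and skipping the update when they match. A small sketch of that comparison, with the HTTP update left as a hypothetical callable:

def sync_inventory(provider_uuid, desired, current, update_fn):
    """Push inventory to Placement only when it actually differs.

    desired/current are dicts shaped like the inventory data in the log,
    e.g. {'VCPU': {'total': 48, 'allocation_ratio': 4.0, ...}, ...};
    update_fn is a hypothetical callable that performs the Placement call.
    """
    if desired == current:
        # Matches the "Inventory has not changed for provider" records above.
        return False
    update_fn(provider_uuid, desired)
    return True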
{{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 715.380174] env[59659]: DEBUG nova.compute.manager [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] [instance: ce3bd633-4538-428d-9258-9222c3c72edd] Start spawning the instance on the hypervisor. {{(pid=59659) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 715.410112] env[59659]: DEBUG nova.virt.hardware [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-30T19:41:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-30T19:41:37Z,direct_url=,disk_format='vmdk',id=0fa786c9-f55e-46dc-b725-aa456ca9ff53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='3f4ccdf020b5413c9a4233eccb6b55a8',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-30T19:41:37Z,virtual_size=,visibility=), allow threads: False {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 715.410410] env[59659]: DEBUG nova.virt.hardware [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] Flavor limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 715.410857] env[59659]: DEBUG nova.virt.hardware [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] Image limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 715.411147] env[59659]: DEBUG nova.virt.hardware [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] Flavor pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 715.411365] env[59659]: DEBUG nova.virt.hardware [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] Image pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 715.411572] env[59659]: DEBUG nova.virt.hardware [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 715.414553] env[59659]: DEBUG nova.virt.hardware [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum 
VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 715.414553] env[59659]: DEBUG nova.virt.hardware [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 715.414553] env[59659]: DEBUG nova.virt.hardware [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] Got 1 possible topologies {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 715.414553] env[59659]: DEBUG nova.virt.hardware [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 715.414951] env[59659]: DEBUG nova.virt.hardware [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 715.414951] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44050d92-75bd-4a79-94bb-a0bb46b14d8d {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 715.425123] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e5a8744e-91c0-4468-9357-be16336471d7 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 715.697584] env[59659]: DEBUG nova.policy [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4dab02812c6c4fed99d165cd787c842c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '291b1c241186446bb53562f866315ad9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59659) authorize /opt/stack/nova/nova/policy.py:203}} [ 716.243989] env[59659]: DEBUG oslo_concurrency.lockutils [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Acquiring lock "f7fc4465-02a5-4715-b50b-04172f097350" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 716.247660] env[59659]: DEBUG oslo_concurrency.lockutils [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Lock "f7fc4465-02a5-4715-b50b-04172f097350" acquired by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 716.261405] env[59659]: DEBUG nova.compute.manager [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: f7fc4465-02a5-4715-b50b-04172f097350] Starting instance... {{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 716.264686] env[59659]: DEBUG nova.network.neutron [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] [instance: 62ade33c-5283-432d-872c-cc162254317d] Successfully created port: 88dc529c-e6b7-4d6e-b7a7-17e32a9977fd {{(pid=59659) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 716.269703] env[59659]: DEBUG oslo_concurrency.lockutils [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Acquiring lock "588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 716.270158] env[59659]: DEBUG oslo_concurrency.lockutils [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Lock "588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 716.282910] env[59659]: DEBUG nova.compute.manager [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] Starting instance... 
{{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 716.330377] env[59659]: DEBUG oslo_concurrency.lockutils [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 716.331071] env[59659]: DEBUG oslo_concurrency.lockutils [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 716.332414] env[59659]: INFO nova.compute.claims [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: f7fc4465-02a5-4715-b50b-04172f097350] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 716.345066] env[59659]: DEBUG oslo_concurrency.lockutils [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 716.514759] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a9262b9-e265-4a11-b66a-1b2adb6d99c2 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 716.523111] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c8c08b4b-a7f5-4b5a-9b3f-930c05380025 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 716.558950] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b809e12-ddc4-4035-a2d9-7ae0e86a7e6b {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 716.568486] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dcbb4836-d417-48c4-8582-ee9a237676d6 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 716.582183] env[59659]: DEBUG nova.compute.provider_tree [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 716.590087] env[59659]: DEBUG nova.scheduler.client.report [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 
'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 716.606192] env[59659]: DEBUG oslo_concurrency.lockutils [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.275s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 716.606727] env[59659]: DEBUG nova.compute.manager [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: f7fc4465-02a5-4715-b50b-04172f097350] Start building networks asynchronously for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 716.609305] env[59659]: DEBUG oslo_concurrency.lockutils [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.264s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 716.610777] env[59659]: INFO nova.compute.claims [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 716.640063] env[59659]: DEBUG nova.compute.utils [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Using /dev/sd instead of None {{(pid=59659) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 716.641813] env[59659]: DEBUG nova.compute.manager [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: f7fc4465-02a5-4715-b50b-04172f097350] Allocating IP information in the background. {{(pid=59659) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 716.642018] env[59659]: DEBUG nova.network.neutron [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: f7fc4465-02a5-4715-b50b-04172f097350] allocate_for_instance() {{(pid=59659) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 716.657017] env[59659]: DEBUG nova.compute.manager [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: f7fc4465-02a5-4715-b50b-04172f097350] Start building block device mappings for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 716.771049] env[59659]: DEBUG nova.compute.manager [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: f7fc4465-02a5-4715-b50b-04172f097350] Start spawning the instance on the hypervisor. 
{{(pid=59659) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 716.794878] env[59659]: DEBUG nova.virt.hardware [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-30T19:41:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-30T19:41:37Z,direct_url=,disk_format='vmdk',id=0fa786c9-f55e-46dc-b725-aa456ca9ff53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='3f4ccdf020b5413c9a4233eccb6b55a8',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-30T19:41:37Z,virtual_size=,visibility=), allow threads: False {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 716.795029] env[59659]: DEBUG nova.virt.hardware [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Flavor limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 716.795185] env[59659]: DEBUG nova.virt.hardware [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Image limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 716.795369] env[59659]: DEBUG nova.virt.hardware [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Flavor pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 716.795510] env[59659]: DEBUG nova.virt.hardware [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Image pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 716.795651] env[59659]: DEBUG nova.virt.hardware [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 716.795847] env[59659]: DEBUG nova.virt.hardware [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 716.796078] env[59659]: DEBUG nova.virt.hardware [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 716.796176] env[59659]: DEBUG nova.virt.hardware [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 
tempest-ServersTestJSON-1247949540-project-member] Got 1 possible topologies {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 716.796335] env[59659]: DEBUG nova.virt.hardware [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 716.796500] env[59659]: DEBUG nova.virt.hardware [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 716.797390] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cb5af5ee-4a2b-43c6-be68-cedc469dcf8e {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 716.805318] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3763434-81bf-4161-8ed2-154723c12974 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 716.810834] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9aa64cdc-1819-4853-aca6-1d45fd21b0c0 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 716.826132] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-926e1ffd-c7eb-48c8-b504-74681e578422 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 716.858039] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c34762fc-9b37-4098-9a7d-3d4dcdf534f5 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 716.865461] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-abbb68ca-5f22-413c-96cc-6cc03e655083 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 716.878723] env[59659]: DEBUG nova.compute.provider_tree [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 716.887725] env[59659]: DEBUG nova.scheduler.client.report [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 716.902368] env[59659]: 
DEBUG oslo_concurrency.lockutils [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.293s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 716.902912] env[59659]: DEBUG nova.compute.manager [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] Start building networks asynchronously for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 716.936279] env[59659]: DEBUG nova.compute.utils [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Using /dev/sd instead of None {{(pid=59659) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 716.937882] env[59659]: DEBUG nova.compute.manager [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] Allocating IP information in the background. {{(pid=59659) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 716.938027] env[59659]: DEBUG nova.network.neutron [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] allocate_for_instance() {{(pid=59659) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 716.957033] env[59659]: DEBUG nova.compute.manager [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] Start building block device mappings for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 717.033026] env[59659]: DEBUG nova.compute.manager [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] Start spawning the instance on the hypervisor. 
{{(pid=59659) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 717.064120] env[59659]: DEBUG nova.virt.hardware [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-30T19:41:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-30T19:41:37Z,direct_url=,disk_format='vmdk',id=0fa786c9-f55e-46dc-b725-aa456ca9ff53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='3f4ccdf020b5413c9a4233eccb6b55a8',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-30T19:41:37Z,virtual_size=,visibility=), allow threads: False {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 717.064120] env[59659]: DEBUG nova.virt.hardware [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Flavor limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 717.064120] env[59659]: DEBUG nova.virt.hardware [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Image limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 717.064288] env[59659]: DEBUG nova.virt.hardware [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Flavor pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 717.064288] env[59659]: DEBUG nova.virt.hardware [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Image pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 717.064288] env[59659]: DEBUG nova.virt.hardware [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 717.065870] env[59659]: DEBUG nova.virt.hardware [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 717.066199] env[59659]: DEBUG nova.virt.hardware [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 717.066707] env[59659]: DEBUG nova.virt.hardware [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] 
Got 1 possible topologies {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 717.066970] env[59659]: DEBUG nova.virt.hardware [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 717.067251] env[59659]: DEBUG nova.virt.hardware [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 717.068206] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf246218-a646-45ef-a5f7-f7ce7492e9c1 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 717.078458] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-293e44f7-7669-45b4-a65a-5da3c493f3b8 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 717.221955] env[59659]: DEBUG nova.network.neutron [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] [instance: ce3bd633-4538-428d-9258-9222c3c72edd] Successfully created port: 808c0d62-376a-4bfe-a50c-92a1fa8874f0 {{(pid=59659) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 717.224357] env[59659]: DEBUG nova.policy [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8090dd0e116e4ac89aeb07e25bc22927', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8689e7ba4d544dfcbbdf7c864cb3f823', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59659) authorize /opt/stack/nova/nova/policy.py:203}} [ 717.267436] env[59659]: DEBUG nova.policy [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7c2b5f71e3034e5f90220c5ebf1bb6d9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'dd2add382fb34e309cc9b0acd9403ef6', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59659) authorize /opt/stack/nova/nova/policy.py:203}} [ 719.721466] env[59659]: DEBUG nova.network.neutron [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: f7fc4465-02a5-4715-b50b-04172f097350] Successfully created port: 8eed8767-c291-4cbf-8803-07bd0caa822b {{(pid=59659) _create_port_minimal 
/opt/stack/nova/nova/network/neutron.py:548}} [ 719.790174] env[59659]: DEBUG nova.network.neutron [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] Successfully created port: 66b70acf-8cb8-4462-b90d-f49fb9026d15 {{(pid=59659) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 719.924738] env[59659]: DEBUG oslo_concurrency.lockutils [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] Acquiring lock "1f7f6276-cfe5-4427-90b6-893e7ad6cffe" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 719.924974] env[59659]: DEBUG oslo_concurrency.lockutils [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] Lock "1f7f6276-cfe5-4427-90b6-893e7ad6cffe" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 719.959526] env[59659]: DEBUG nova.compute.manager [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] Starting instance... {{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 720.018442] env[59659]: DEBUG oslo_concurrency.lockutils [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 720.018702] env[59659]: DEBUG oslo_concurrency.lockutils [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 720.020350] env[59659]: INFO nova.compute.claims [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 720.203619] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de1224eb-5810-4cb0-a22d-49289065d3bc {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 720.211733] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-91f34445-be57-4119-a1dc-268d578149a6 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 720.247705] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-bb76171d-c37c-439d-b48e-88dfbe369a93 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 720.255309] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7bc62b74-1c62-4d0e-b855-96825861f1b5 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 720.269879] env[59659]: DEBUG nova.compute.provider_tree [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 720.279706] env[59659]: DEBUG nova.scheduler.client.report [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 720.295797] env[59659]: DEBUG oslo_concurrency.lockutils [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.277s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 720.296168] env[59659]: DEBUG nova.compute.manager [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] Start building networks asynchronously for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 720.331948] env[59659]: DEBUG nova.compute.utils [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] Using /dev/sd instead of None {{(pid=59659) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 720.333594] env[59659]: DEBUG nova.compute.manager [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] Allocating IP information in the background. 
{{(pid=59659) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 720.333800] env[59659]: DEBUG nova.network.neutron [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] allocate_for_instance() {{(pid=59659) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 720.344377] env[59659]: DEBUG nova.compute.manager [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] Start building block device mappings for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 720.376290] env[59659]: INFO nova.virt.block_device [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] Booting with volume a29828fb-9b19-4403-8dcf-5f30e3e7298c at /dev/sda [ 720.417265] env[59659]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-bf1085ff-657b-4ca6-87fc-d99add719f4f {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 720.425429] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c8e72f3e-1b77-4714-a828-a2dd339f4d24 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 720.447552] env[59659]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-c402af41-fc61-4902-9df1-d1d88b3667d3 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 720.454874] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d43f641-638a-4061-be43-a6e55a23bfa6 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 720.477076] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e2dea25c-a633-4494-80a4-180d44a8069f {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 720.483780] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1977056-5020-49dc-87ab-879f3b4fc9aa {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 720.497577] env[59659]: DEBUG nova.virt.block_device [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] Updating existing volume attachment record: bc4a410a-054e-4f2e-a8f1-0dcd32213ee5 {{(pid=59659) _volume_attach /opt/stack/nova/nova/virt/block_device.py:631}} [ 720.722344] env[59659]: DEBUG nova.policy [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '18cab6bcb60c42d48d6cb367a7d2406c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 
'b04115b4b0ed49aea28a1bee3ee07a27', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59659) authorize /opt/stack/nova/nova/policy.py:203}} [ 720.746685] env[59659]: DEBUG nova.compute.manager [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] Start spawning the instance on the hypervisor. {{(pid=59659) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 720.746799] env[59659]: DEBUG nova.virt.hardware [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-30T19:41:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=,container_format=,created_at=,direct_url=,disk_format=,id=,min_disk=0,min_ram=0,name=,owner=,properties=ImageMetaProps,protected=,size=1073741824,status='active',tags=,updated_at=,virtual_size=,visibility=), allow threads: False {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 720.747195] env[59659]: DEBUG nova.virt.hardware [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] Flavor limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 720.747412] env[59659]: DEBUG nova.virt.hardware [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] Image limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 720.748044] env[59659]: DEBUG nova.virt.hardware [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] Flavor pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 720.748235] env[59659]: DEBUG nova.virt.hardware [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] Image pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 720.748623] env[59659]: DEBUG nova.virt.hardware [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 720.748897] env[59659]: DEBUG nova.virt.hardware [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum 
VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 720.749119] env[59659]: DEBUG nova.virt.hardware [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 720.749330] env[59659]: DEBUG nova.virt.hardware [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] Got 1 possible topologies {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 720.749788] env[59659]: DEBUG nova.virt.hardware [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 720.750142] env[59659]: DEBUG nova.virt.hardware [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 720.751376] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d94960c-45b9-4856-bb09-d05ce8aa38e8 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 720.762175] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c67af42-7506-4f3e-8ef4-52046b66960d {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 720.882172] env[59659]: ERROR nova.compute.manager [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 711f0237-f381-4494-9f23-9a9c2e51d498, please check neutron logs for more information. 
[ 720.882172] env[59659]: ERROR nova.compute.manager Traceback (most recent call last): [ 720.882172] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 720.882172] env[59659]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 720.882172] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 720.882172] env[59659]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 720.882172] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 720.882172] env[59659]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 720.882172] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 720.882172] env[59659]: ERROR nova.compute.manager self.force_reraise() [ 720.882172] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 720.882172] env[59659]: ERROR nova.compute.manager raise self.value [ 720.882172] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 720.882172] env[59659]: ERROR nova.compute.manager updated_port = self._update_port( [ 720.882172] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 720.882172] env[59659]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 720.882614] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 720.882614] env[59659]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 720.882614] env[59659]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 711f0237-f381-4494-9f23-9a9c2e51d498, please check neutron logs for more information. 
[ 720.882614] env[59659]: ERROR nova.compute.manager [ 720.882614] env[59659]: Traceback (most recent call last): [ 720.882614] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 720.882614] env[59659]: listener.cb(fileno) [ 720.882614] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 720.882614] env[59659]: result = function(*args, **kwargs) [ 720.882614] env[59659]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 720.882614] env[59659]: return func(*args, **kwargs) [ 720.882614] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 720.882614] env[59659]: raise e [ 720.882614] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 720.882614] env[59659]: nwinfo = self.network_api.allocate_for_instance( [ 720.882614] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 720.882614] env[59659]: created_port_ids = self._update_ports_for_instance( [ 720.882614] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 720.882614] env[59659]: with excutils.save_and_reraise_exception(): [ 720.882614] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 720.882614] env[59659]: self.force_reraise() [ 720.882614] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 720.882614] env[59659]: raise self.value [ 720.882614] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 720.882614] env[59659]: updated_port = self._update_port( [ 720.882614] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 720.882614] env[59659]: _ensure_no_port_binding_failure(port) [ 720.882614] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 720.882614] env[59659]: raise exception.PortBindingFailed(port_id=port['id']) [ 720.883349] env[59659]: nova.exception.PortBindingFailed: Binding failed for port 711f0237-f381-4494-9f23-9a9c2e51d498, please check neutron logs for more information. [ 720.883349] env[59659]: Removing descriptor: 12 [ 720.883409] env[59659]: ERROR nova.compute.manager [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] [instance: fdd34513-15af-4294-8a8a-e3b095188eda] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 711f0237-f381-4494-9f23-9a9c2e51d498, please check neutron logs for more information. 
[ 720.883409] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] Traceback (most recent call last): [ 720.883409] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 720.883409] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] yield resources [ 720.883409] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 720.883409] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] self.driver.spawn(context, instance, image_meta, [ 720.883409] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 720.883409] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] self._vmops.spawn(context, instance, image_meta, injected_files, [ 720.883409] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 720.883409] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] vm_ref = self.build_virtual_machine(instance, [ 720.883409] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 720.883682] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] vif_infos = vmwarevif.get_vif_info(self._session, [ 720.883682] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 720.883682] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] for vif in network_info: [ 720.883682] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 720.883682] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] return self._sync_wrapper(fn, *args, **kwargs) [ 720.883682] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 720.883682] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] self.wait() [ 720.883682] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 720.883682] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] self[:] = self._gt.wait() [ 720.883682] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 720.883682] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] return self._exit_event.wait() [ 720.883682] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 720.883682] env[59659]: ERROR nova.compute.manager 
[instance: fdd34513-15af-4294-8a8a-e3b095188eda] result = hub.switch() [ 720.883682] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 720.884053] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] return self.greenlet.switch() [ 720.884053] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 720.884053] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] result = function(*args, **kwargs) [ 720.884053] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 720.884053] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] return func(*args, **kwargs) [ 720.884053] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 720.884053] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] raise e [ 720.884053] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 720.884053] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] nwinfo = self.network_api.allocate_for_instance( [ 720.884053] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 720.884053] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] created_port_ids = self._update_ports_for_instance( [ 720.884053] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 720.884053] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] with excutils.save_and_reraise_exception(): [ 720.884400] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 720.884400] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] self.force_reraise() [ 720.884400] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 720.884400] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] raise self.value [ 720.884400] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 720.884400] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] updated_port = self._update_port( [ 720.884400] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 720.884400] env[59659]: ERROR nova.compute.manager [instance: 
fdd34513-15af-4294-8a8a-e3b095188eda] _ensure_no_port_binding_failure(port) [ 720.884400] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 720.884400] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] raise exception.PortBindingFailed(port_id=port['id']) [ 720.884400] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] nova.exception.PortBindingFailed: Binding failed for port 711f0237-f381-4494-9f23-9a9c2e51d498, please check neutron logs for more information. [ 720.884400] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] [ 720.884704] env[59659]: INFO nova.compute.manager [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] [instance: fdd34513-15af-4294-8a8a-e3b095188eda] Terminating instance [ 720.890742] env[59659]: DEBUG oslo_concurrency.lockutils [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] Acquiring lock "refresh_cache-fdd34513-15af-4294-8a8a-e3b095188eda" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 720.890742] env[59659]: DEBUG oslo_concurrency.lockutils [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] Acquired lock "refresh_cache-fdd34513-15af-4294-8a8a-e3b095188eda" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 720.890742] env[59659]: DEBUG nova.network.neutron [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] [instance: fdd34513-15af-4294-8a8a-e3b095188eda] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 720.980674] env[59659]: DEBUG nova.network.neutron [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] [instance: fdd34513-15af-4294-8a8a-e3b095188eda] Instance cache missing network info. 
{{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 721.216594] env[59659]: DEBUG oslo_concurrency.lockutils [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] Acquiring lock "ad58bbc3-1ec8-4567-ba07-c8161bcc8380" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 721.216812] env[59659]: DEBUG oslo_concurrency.lockutils [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] Lock "ad58bbc3-1ec8-4567-ba07-c8161bcc8380" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 721.231166] env[59659]: DEBUG nova.compute.manager [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] Starting instance... {{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 721.294298] env[59659]: DEBUG oslo_concurrency.lockutils [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 721.294544] env[59659]: DEBUG oslo_concurrency.lockutils [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 721.297183] env[59659]: INFO nova.compute.claims [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 721.485551] env[59659]: DEBUG nova.network.neutron [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] [instance: fdd34513-15af-4294-8a8a-e3b095188eda] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 721.497031] env[59659]: DEBUG oslo_concurrency.lockutils [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] Releasing lock "refresh_cache-fdd34513-15af-4294-8a8a-e3b095188eda" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 721.497509] env[59659]: DEBUG nova.compute.manager [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] [instance: fdd34513-15af-4294-8a8a-e3b095188eda] Start destroying the
instance on the hypervisor. {{(pid=59659) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 721.497727] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] [instance: fdd34513-15af-4294-8a8a-e3b095188eda] Destroying instance {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 721.498565] env[59659]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-07819da4-9401-44e4-aa2e-c507bcf6dfa7 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 721.507341] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-949f5a0c-7285-49f4-8093-5fd1ac0dfa07 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 721.526024] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00987b3c-e9ef-4205-b572-065bc0550abb {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 721.537022] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-75d9dd3b-31fa-4a4a-a6c9-88559579d677 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 721.541301] env[59659]: WARNING nova.virt.vmwareapi.vmops [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] [instance: fdd34513-15af-4294-8a8a-e3b095188eda] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance fdd34513-15af-4294-8a8a-e3b095188eda could not be found. [ 721.541520] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] [instance: fdd34513-15af-4294-8a8a-e3b095188eda] Instance destroyed {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 721.541697] env[59659]: INFO nova.compute.manager [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] [instance: fdd34513-15af-4294-8a8a-e3b095188eda] Took 0.04 seconds to destroy the instance on the hypervisor. [ 721.541932] env[59659]: DEBUG oslo.service.loopingcall [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return.
{{(pid=59659) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 721.543127] env[59659]: DEBUG nova.compute.manager [-] [instance: fdd34513-15af-4294-8a8a-e3b095188eda] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 721.543127] env[59659]: DEBUG nova.network.neutron [-] [instance: fdd34513-15af-4294-8a8a-e3b095188eda] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 721.572726] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-46193727-0489-41d6-9c23-a92d8b69d18c {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 721.578752] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cae4c0f8-f30e-48ba-b278-94c0282f8977 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 721.592929] env[59659]: DEBUG nova.compute.provider_tree [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 721.606036] env[59659]: DEBUG nova.scheduler.client.report [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 721.621406] env[59659]: DEBUG oslo_concurrency.lockutils [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.327s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 721.621961] env[59659]: DEBUG nova.compute.manager [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] Start building networks asynchronously for instance. 
{{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 721.660757] env[59659]: DEBUG nova.compute.utils [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] Using /dev/sd instead of None {{(pid=59659) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 721.665611] env[59659]: DEBUG nova.compute.manager [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] Allocating IP information in the background. {{(pid=59659) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 721.665611] env[59659]: DEBUG nova.network.neutron [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] allocate_for_instance() {{(pid=59659) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 721.672069] env[59659]: DEBUG nova.compute.manager [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] Start building block device mappings for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 721.749089] env[59659]: DEBUG nova.compute.manager [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] Start spawning the instance on the hypervisor. 
{{(pid=59659) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 721.774759] env[59659]: DEBUG nova.virt.hardware [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-30T19:41:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-30T19:41:37Z,direct_url=,disk_format='vmdk',id=0fa786c9-f55e-46dc-b725-aa456ca9ff53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='3f4ccdf020b5413c9a4233eccb6b55a8',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-30T19:41:37Z,virtual_size=,visibility=), allow threads: False {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 721.775009] env[59659]: DEBUG nova.virt.hardware [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] Flavor limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 721.775181] env[59659]: DEBUG nova.virt.hardware [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] Image limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 721.775365] env[59659]: DEBUG nova.virt.hardware [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] Flavor pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 721.775504] env[59659]: DEBUG nova.virt.hardware [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] Image pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 721.775644] env[59659]: DEBUG nova.virt.hardware [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 721.775846] env[59659]: DEBUG nova.virt.hardware [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 721.776187] env[59659]: DEBUG nova.virt.hardware [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 721.776451] env[59659]: DEBUG 
nova.virt.hardware [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] Got 1 possible topologies {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 721.776656] env[59659]: DEBUG nova.virt.hardware [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 721.776858] env[59659]: DEBUG nova.virt.hardware [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 721.777747] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b1ae10c-8b9e-4827-bbec-3a5dfe689420 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 721.788294] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc4ac493-d43b-4bda-a792-6008cb84c50d {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 721.822756] env[59659]: DEBUG nova.network.neutron [-] [instance: fdd34513-15af-4294-8a8a-e3b095188eda] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 721.830217] env[59659]: DEBUG nova.network.neutron [-] [instance: fdd34513-15af-4294-8a8a-e3b095188eda] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 721.840176] env[59659]: INFO nova.compute.manager [-] [instance: fdd34513-15af-4294-8a8a-e3b095188eda] Took 0.30 seconds to deallocate network for instance. 
[ 721.842857] env[59659]: DEBUG nova.compute.claims [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] [instance: fdd34513-15af-4294-8a8a-e3b095188eda] Aborting claim: {{(pid=59659) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 721.842857] env[59659]: DEBUG oslo_concurrency.lockutils [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 721.843090] env[59659]: DEBUG oslo_concurrency.lockutils [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 722.020893] env[59659]: ERROR nova.compute.manager [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 88dc529c-e6b7-4d6e-b7a7-17e32a9977fd, please check neutron logs for more information. [ 722.020893] env[59659]: ERROR nova.compute.manager Traceback (most recent call last): [ 722.020893] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 722.020893] env[59659]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 722.020893] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 722.020893] env[59659]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 722.020893] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 722.020893] env[59659]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 722.020893] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 722.020893] env[59659]: ERROR nova.compute.manager self.force_reraise() [ 722.020893] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 722.020893] env[59659]: ERROR nova.compute.manager raise self.value [ 722.020893] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 722.020893] env[59659]: ERROR nova.compute.manager updated_port = self._update_port( [ 722.020893] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 722.020893] env[59659]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 722.021904] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 722.021904] env[59659]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 722.021904] env[59659]: ERROR 
nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 88dc529c-e6b7-4d6e-b7a7-17e32a9977fd, please check neutron logs for more information. [ 722.021904] env[59659]: ERROR nova.compute.manager [ 722.021904] env[59659]: Traceback (most recent call last): [ 722.021904] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 722.021904] env[59659]: listener.cb(fileno) [ 722.021904] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 722.021904] env[59659]: result = function(*args, **kwargs) [ 722.021904] env[59659]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 722.021904] env[59659]: return func(*args, **kwargs) [ 722.021904] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 722.021904] env[59659]: raise e [ 722.021904] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 722.021904] env[59659]: nwinfo = self.network_api.allocate_for_instance( [ 722.021904] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 722.021904] env[59659]: created_port_ids = self._update_ports_for_instance( [ 722.021904] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 722.021904] env[59659]: with excutils.save_and_reraise_exception(): [ 722.021904] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 722.021904] env[59659]: self.force_reraise() [ 722.021904] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 722.021904] env[59659]: raise self.value [ 722.021904] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 722.021904] env[59659]: updated_port = self._update_port( [ 722.021904] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 722.021904] env[59659]: _ensure_no_port_binding_failure(port) [ 722.021904] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 722.021904] env[59659]: raise exception.PortBindingFailed(port_id=port['id']) [ 722.023987] env[59659]: nova.exception.PortBindingFailed: Binding failed for port 88dc529c-e6b7-4d6e-b7a7-17e32a9977fd, please check neutron logs for more information. [ 722.023987] env[59659]: Removing descriptor: 16 [ 722.023987] env[59659]: ERROR nova.compute.manager [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] [instance: 62ade33c-5283-432d-872c-cc162254317d] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 88dc529c-e6b7-4d6e-b7a7-17e32a9977fd, please check neutron logs for more information. 
[ 722.023987] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] Traceback (most recent call last): [ 722.023987] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 722.023987] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] yield resources [ 722.023987] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 722.023987] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] self.driver.spawn(context, instance, image_meta, [ 722.023987] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 722.023987] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 722.023987] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 722.023987] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] vm_ref = self.build_virtual_machine(instance, [ 722.024408] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 722.024408] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] vif_infos = vmwarevif.get_vif_info(self._session, [ 722.024408] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 722.024408] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] for vif in network_info: [ 722.024408] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 722.024408] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] return self._sync_wrapper(fn, *args, **kwargs) [ 722.024408] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 722.024408] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] self.wait() [ 722.024408] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 722.024408] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] self[:] = self._gt.wait() [ 722.024408] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 722.024408] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] return self._exit_event.wait() [ 722.024408] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 722.024408] env[59659]: ERROR nova.compute.manager 
[instance: 62ade33c-5283-432d-872c-cc162254317d] result = hub.switch() [ 722.024887] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 722.024887] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] return self.greenlet.switch() [ 722.024887] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 722.024887] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] result = function(*args, **kwargs) [ 722.024887] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 722.024887] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] return func(*args, **kwargs) [ 722.024887] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 722.024887] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] raise e [ 722.024887] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 722.024887] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] nwinfo = self.network_api.allocate_for_instance( [ 722.024887] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 722.024887] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] created_port_ids = self._update_ports_for_instance( [ 722.024887] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 722.029129] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] with excutils.save_and_reraise_exception(): [ 722.029129] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 722.029129] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] self.force_reraise() [ 722.029129] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 722.029129] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] raise self.value [ 722.029129] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 722.029129] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] updated_port = self._update_port( [ 722.029129] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 722.029129] env[59659]: ERROR nova.compute.manager [instance: 
62ade33c-5283-432d-872c-cc162254317d] _ensure_no_port_binding_failure(port) [ 722.029129] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 722.029129] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] raise exception.PortBindingFailed(port_id=port['id']) [ 722.029129] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] nova.exception.PortBindingFailed: Binding failed for port 88dc529c-e6b7-4d6e-b7a7-17e32a9977fd, please check neutron logs for more information. [ 722.029129] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] [ 722.030609] env[59659]: INFO nova.compute.manager [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] [instance: 62ade33c-5283-432d-872c-cc162254317d] Terminating instance [ 722.030609] env[59659]: DEBUG oslo_concurrency.lockutils [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] Acquiring lock "refresh_cache-62ade33c-5283-432d-872c-cc162254317d" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 722.030609] env[59659]: DEBUG oslo_concurrency.lockutils [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] Acquired lock "refresh_cache-62ade33c-5283-432d-872c-cc162254317d" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 722.030609] env[59659]: DEBUG nova.network.neutron [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] [instance: 62ade33c-5283-432d-872c-cc162254317d] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 722.047981] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a60bf30b-377d-458d-a3a8-2f0834d822bc {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 722.056934] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-660e0e71-c004-4e5e-be09-d734a23a4cf9 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 722.088777] env[59659]: DEBUG nova.network.neutron [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] [instance: 62ade33c-5283-432d-872c-cc162254317d] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 722.094283] env[59659]: ERROR nova.compute.manager [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port f96d3967-9568-40ee-9bd9-a29f08464a46, please check neutron logs for more information. 
[ 722.094283] env[59659]: ERROR nova.compute.manager Traceback (most recent call last): [ 722.094283] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 722.094283] env[59659]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 722.094283] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 722.094283] env[59659]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 722.094283] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 722.094283] env[59659]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 722.094283] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 722.094283] env[59659]: ERROR nova.compute.manager self.force_reraise() [ 722.094283] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 722.094283] env[59659]: ERROR nova.compute.manager raise self.value [ 722.094283] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 722.094283] env[59659]: ERROR nova.compute.manager updated_port = self._update_port( [ 722.094283] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 722.094283] env[59659]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 722.094727] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 722.094727] env[59659]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 722.094727] env[59659]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port f96d3967-9568-40ee-9bd9-a29f08464a46, please check neutron logs for more information. 
[ 722.094727] env[59659]: ERROR nova.compute.manager [ 722.094727] env[59659]: Traceback (most recent call last): [ 722.094727] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 722.094727] env[59659]: listener.cb(fileno) [ 722.094727] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 722.094727] env[59659]: result = function(*args, **kwargs) [ 722.094727] env[59659]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 722.094727] env[59659]: return func(*args, **kwargs) [ 722.094727] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 722.094727] env[59659]: raise e [ 722.094727] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 722.094727] env[59659]: nwinfo = self.network_api.allocate_for_instance( [ 722.094727] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 722.094727] env[59659]: created_port_ids = self._update_ports_for_instance( [ 722.094727] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 722.094727] env[59659]: with excutils.save_and_reraise_exception(): [ 722.094727] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 722.094727] env[59659]: self.force_reraise() [ 722.094727] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 722.094727] env[59659]: raise self.value [ 722.094727] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 722.094727] env[59659]: updated_port = self._update_port( [ 722.094727] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 722.094727] env[59659]: _ensure_no_port_binding_failure(port) [ 722.094727] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 722.094727] env[59659]: raise exception.PortBindingFailed(port_id=port['id']) [ 722.095467] env[59659]: nova.exception.PortBindingFailed: Binding failed for port f96d3967-9568-40ee-9bd9-a29f08464a46, please check neutron logs for more information. [ 722.095467] env[59659]: Removing descriptor: 14 [ 722.095467] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-765cab3e-d495-4a76-8bde-e1f86c46ae62 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 722.095794] env[59659]: ERROR nova.compute.manager [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] [instance: 9759f284-26e2-466e-9504-ffb63a359f27] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port f96d3967-9568-40ee-9bd9-a29f08464a46, please check neutron logs for more information. 
[ 722.095794] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] Traceback (most recent call last): [ 722.095794] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 722.095794] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] yield resources [ 722.095794] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 722.095794] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] self.driver.spawn(context, instance, image_meta, [ 722.095794] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 722.095794] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] self._vmops.spawn(context, instance, image_meta, injected_files, [ 722.095794] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 722.095794] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] vm_ref = self.build_virtual_machine(instance, [ 722.095794] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 722.096086] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] vif_infos = vmwarevif.get_vif_info(self._session, [ 722.096086] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 722.096086] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] for vif in network_info: [ 722.096086] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 722.096086] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] return self._sync_wrapper(fn, *args, **kwargs) [ 722.096086] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 722.096086] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] self.wait() [ 722.096086] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 722.096086] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] self[:] = self._gt.wait() [ 722.096086] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 722.096086] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] return self._exit_event.wait() [ 722.096086] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 722.096086] env[59659]: ERROR nova.compute.manager 
[instance: 9759f284-26e2-466e-9504-ffb63a359f27] result = hub.switch() [ 722.096086] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 722.096410] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] return self.greenlet.switch() [ 722.096410] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 722.096410] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] result = function(*args, **kwargs) [ 722.096410] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 722.096410] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] return func(*args, **kwargs) [ 722.096410] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 722.096410] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] raise e [ 722.096410] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 722.096410] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] nwinfo = self.network_api.allocate_for_instance( [ 722.096410] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 722.096410] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] created_port_ids = self._update_ports_for_instance( [ 722.096410] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 722.096410] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] with excutils.save_and_reraise_exception(): [ 722.096757] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 722.096757] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] self.force_reraise() [ 722.096757] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 722.096757] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] raise self.value [ 722.096757] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 722.096757] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] updated_port = self._update_port( [ 722.096757] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 722.096757] env[59659]: ERROR nova.compute.manager [instance: 
9759f284-26e2-466e-9504-ffb63a359f27] _ensure_no_port_binding_failure(port) [ 722.096757] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 722.096757] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] raise exception.PortBindingFailed(port_id=port['id']) [ 722.096757] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] nova.exception.PortBindingFailed: Binding failed for port f96d3967-9568-40ee-9bd9-a29f08464a46, please check neutron logs for more information. [ 722.096757] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] [ 722.097039] env[59659]: INFO nova.compute.manager [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] [instance: 9759f284-26e2-466e-9504-ffb63a359f27] Terminating instance [ 722.099140] env[59659]: DEBUG oslo_concurrency.lockutils [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] Acquiring lock "refresh_cache-9759f284-26e2-466e-9504-ffb63a359f27" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 722.099318] env[59659]: DEBUG oslo_concurrency.lockutils [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] Acquired lock "refresh_cache-9759f284-26e2-466e-9504-ffb63a359f27" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 722.099480] env[59659]: DEBUG nova.network.neutron [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] [instance: 9759f284-26e2-466e-9504-ffb63a359f27] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 722.103670] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-388849ee-e807-4b58-8e46-983e75f1847c {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 722.118314] env[59659]: DEBUG nova.compute.provider_tree [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 722.126963] env[59659]: DEBUG nova.scheduler.client.report [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider 
/opt/stack/nova/nova/scheduler/client/report.py:940}} [ 722.141408] env[59659]: DEBUG oslo_concurrency.lockutils [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.298s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 722.141692] env[59659]: ERROR nova.compute.manager [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] [instance: fdd34513-15af-4294-8a8a-e3b095188eda] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 711f0237-f381-4494-9f23-9a9c2e51d498, please check neutron logs for more information. [ 722.141692] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] Traceback (most recent call last): [ 722.141692] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 722.141692] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] self.driver.spawn(context, instance, image_meta, [ 722.141692] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 722.141692] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] self._vmops.spawn(context, instance, image_meta, injected_files, [ 722.141692] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 722.141692] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] vm_ref = self.build_virtual_machine(instance, [ 722.141692] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 722.141692] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] vif_infos = vmwarevif.get_vif_info(self._session, [ 722.141692] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 722.141970] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] for vif in network_info: [ 722.141970] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 722.141970] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] return self._sync_wrapper(fn, *args, **kwargs) [ 722.141970] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 722.141970] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] self.wait() [ 722.141970] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 722.141970] env[59659]: ERROR nova.compute.manager [instance: 
fdd34513-15af-4294-8a8a-e3b095188eda] self[:] = self._gt.wait() [ 722.141970] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 722.141970] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] return self._exit_event.wait() [ 722.141970] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 722.141970] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] result = hub.switch() [ 722.141970] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 722.141970] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] return self.greenlet.switch() [ 722.141970] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 722.142311] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] result = function(*args, **kwargs) [ 722.142311] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 722.142311] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] return func(*args, **kwargs) [ 722.142311] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 722.142311] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] raise e [ 722.142311] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 722.142311] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] nwinfo = self.network_api.allocate_for_instance( [ 722.142311] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 722.142311] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] created_port_ids = self._update_ports_for_instance( [ 722.142311] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 722.142311] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] with excutils.save_and_reraise_exception(): [ 722.142311] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 722.142311] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] self.force_reraise() [ 722.142614] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 722.142614] env[59659]: ERROR nova.compute.manager [instance: 
fdd34513-15af-4294-8a8a-e3b095188eda] raise self.value [ 722.142614] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 722.142614] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] updated_port = self._update_port( [ 722.142614] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 722.142614] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] _ensure_no_port_binding_failure(port) [ 722.142614] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 722.142614] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] raise exception.PortBindingFailed(port_id=port['id']) [ 722.142614] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] nova.exception.PortBindingFailed: Binding failed for port 711f0237-f381-4494-9f23-9a9c2e51d498, please check neutron logs for more information. [ 722.142614] env[59659]: ERROR nova.compute.manager [instance: fdd34513-15af-4294-8a8a-e3b095188eda] [ 722.142614] env[59659]: DEBUG nova.compute.utils [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] [instance: fdd34513-15af-4294-8a8a-e3b095188eda] Binding failed for port 711f0237-f381-4494-9f23-9a9c2e51d498, please check neutron logs for more information. {{(pid=59659) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 722.144268] env[59659]: DEBUG nova.compute.manager [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] [instance: fdd34513-15af-4294-8a8a-e3b095188eda] Build of instance fdd34513-15af-4294-8a8a-e3b095188eda was re-scheduled: Binding failed for port 711f0237-f381-4494-9f23-9a9c2e51d498, please check neutron logs for more information. 
{{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 722.144663] env[59659]: DEBUG nova.compute.manager [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] [instance: fdd34513-15af-4294-8a8a-e3b095188eda] Unplugging VIFs for instance {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 722.144968] env[59659]: DEBUG oslo_concurrency.lockutils [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] Acquiring lock "refresh_cache-fdd34513-15af-4294-8a8a-e3b095188eda" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 722.145102] env[59659]: DEBUG oslo_concurrency.lockutils [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] Acquired lock "refresh_cache-fdd34513-15af-4294-8a8a-e3b095188eda" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 722.145310] env[59659]: DEBUG nova.network.neutron [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] [instance: fdd34513-15af-4294-8a8a-e3b095188eda] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 722.205679] env[59659]: DEBUG nova.network.neutron [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] [instance: fdd34513-15af-4294-8a8a-e3b095188eda] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 722.211365] env[59659]: DEBUG nova.policy [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '307ded03aa7142f1bb4b8134a4dbb3f8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cd4ace35259947bfa4d07c4c3f42d55b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59659) authorize /opt/stack/nova/nova/policy.py:203}} [ 722.213323] env[59659]: DEBUG nova.network.neutron [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] [instance: 9759f284-26e2-466e-9504-ffb63a359f27] Instance cache missing network info. 
{{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 722.393852] env[59659]: DEBUG nova.network.neutron [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] Successfully created port: 5094a1db-1c30-434f-bb1a-00afbf387d0d {{(pid=59659) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 722.431824] env[59659]: DEBUG oslo_concurrency.lockutils [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Acquiring lock "7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 722.432066] env[59659]: DEBUG oslo_concurrency.lockutils [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Lock "7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 722.448573] env[59659]: DEBUG nova.compute.manager [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] Starting instance... {{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 722.477171] env[59659]: DEBUG nova.network.neutron [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] [instance: 62ade33c-5283-432d-872c-cc162254317d] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 722.489088] env[59659]: DEBUG oslo_concurrency.lockutils [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] Releasing lock "refresh_cache-62ade33c-5283-432d-872c-cc162254317d" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 722.489088] env[59659]: DEBUG nova.compute.manager [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] [instance: 62ade33c-5283-432d-872c-cc162254317d] Start destroying the instance on the hypervisor. 
{{(pid=59659) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 722.489088] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] [instance: 62ade33c-5283-432d-872c-cc162254317d] Destroying instance {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 722.489088] env[59659]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-f12c7ad7-86dc-40e9-85e5-4181b3f2a4c4 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 722.497490] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1c4ce39-641c-4ac6-9757-b23d1d9db212 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 722.508340] env[59659]: DEBUG oslo_concurrency.lockutils [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 722.508556] env[59659]: DEBUG oslo_concurrency.lockutils [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 722.510093] env[59659]: INFO nova.compute.claims [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 722.525665] env[59659]: WARNING nova.virt.vmwareapi.vmops [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] [instance: 62ade33c-5283-432d-872c-cc162254317d] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 62ade33c-5283-432d-872c-cc162254317d could not be found. [ 722.525825] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] [instance: 62ade33c-5283-432d-872c-cc162254317d] Instance destroyed {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 722.526010] env[59659]: INFO nova.compute.manager [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] [instance: 62ade33c-5283-432d-872c-cc162254317d] Took 0.04 seconds to destroy the instance on the hypervisor. [ 722.526255] env[59659]: DEBUG oslo.service.loopingcall [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=59659) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 722.526467] env[59659]: DEBUG nova.compute.manager [-] [instance: 62ade33c-5283-432d-872c-cc162254317d] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 722.526559] env[59659]: DEBUG nova.network.neutron [-] [instance: 62ade33c-5283-432d-872c-cc162254317d] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 722.582969] env[59659]: DEBUG nova.network.neutron [-] [instance: 62ade33c-5283-432d-872c-cc162254317d] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 722.590487] env[59659]: DEBUG nova.network.neutron [-] [instance: 62ade33c-5283-432d-872c-cc162254317d] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 722.598025] env[59659]: INFO nova.compute.manager [-] [instance: 62ade33c-5283-432d-872c-cc162254317d] Took 0.07 seconds to deallocate network for instance. [ 722.602587] env[59659]: DEBUG nova.compute.claims [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] [instance: 62ade33c-5283-432d-872c-cc162254317d] Aborting claim: {{(pid=59659) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 722.602760] env[59659]: DEBUG oslo_concurrency.lockutils [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 722.617424] env[59659]: DEBUG nova.network.neutron [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] [instance: fdd34513-15af-4294-8a8a-e3b095188eda] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 722.625852] env[59659]: DEBUG oslo_concurrency.lockutils [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] Releasing lock "refresh_cache-fdd34513-15af-4294-8a8a-e3b095188eda" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 722.626395] env[59659]: DEBUG nova.compute.manager [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 722.626395] env[59659]: DEBUG nova.compute.manager [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] [instance: fdd34513-15af-4294-8a8a-e3b095188eda] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 722.626395] env[59659]: DEBUG nova.network.neutron [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] [instance: fdd34513-15af-4294-8a8a-e3b095188eda] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 722.670546] env[59659]: DEBUG nova.network.neutron [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] [instance: fdd34513-15af-4294-8a8a-e3b095188eda] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 722.678722] env[59659]: DEBUG nova.network.neutron [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] [instance: fdd34513-15af-4294-8a8a-e3b095188eda] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 722.686382] env[59659]: INFO nova.compute.manager [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] [instance: fdd34513-15af-4294-8a8a-e3b095188eda] Took 0.06 seconds to deallocate network for instance. 
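The failed builds above (instances fdd34513-15af-4294-8a8a-e3b095188eda, 62ade33c-5283-432d-872c-cc162254317d and 9759f284-26e2-466e-9504-ffb63a359f27) all bottom out in the same frame: nova/network/neutron.py line 294, where _ensure_no_port_binding_failure(port) raises nova.exception.PortBindingFailed, after which the instance is re-scheduled and its allocations are cleaned up. The sketch below is a minimal, hedged reconstruction of that check for readers following the tracebacks; the function name, the exception name and the message text are taken from the log itself, while the 'binding:vif_type' == 'binding_failed' comparison and the sample port dict are assumptions for illustration, not the actual Nova implementation.

    # Minimal sketch (not the real nova/network/neutron.py code) of the check
    # each traceback above ends in. The names and the exception message come
    # from the log; the 'binding:vif_type' == 'binding_failed' test is an
    # assumption about how a failed Neutron binding is detected.

    class PortBindingFailed(Exception):
        def __init__(self, port_id):
            super().__init__(
                "Binding failed for port %s, please check neutron logs for "
                "more information." % port_id)


    def _ensure_no_port_binding_failure(port):
        # If Neutron could not complete the binding, turn it into a hard
        # failure so the compute manager aborts the claim and re-schedules
        # the build (assumed behaviour, mirroring the log above).
        if port.get('binding:vif_type') == 'binding_failed':
            raise PortBindingFailed(port_id=port['id'])


    # Hypothetical port dict shaped like a Neutron port-show result.
    failed_port = {'id': 'f96d3967-9568-40ee-9bd9-a29f08464a46',
                   'binding:vif_type': 'binding_failed'}

    try:
        _ensure_no_port_binding_failure(failed_port)
    except PortBindingFailed as exc:
        print(exc)  # matches the "Binding failed for port ..." lines above

As in the log, the exception itself only names the port; the underlying cause has to be read from the Neutron agent logs, which is why each failure here is followed by the terminate/deallocate/abort-claim sequence rather than a retry on the same host.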
[ 722.701689] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-812431d4-301b-47d8-b21c-ae918f2f60e0 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 722.710037] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-788a41c9-1a9f-4b51-9d86-7a6cc9eed7d6 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 722.745512] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4208bf3e-79d7-4161-aa27-7d112aebbbfc {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 722.754215] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2aadea5c-ff9f-4490-bc59-f0cadbd4f63c {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 722.769568] env[59659]: DEBUG nova.compute.provider_tree [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 722.785802] env[59659]: DEBUG nova.scheduler.client.report [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 722.798245] env[59659]: DEBUG oslo_concurrency.lockutils [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.290s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 722.798703] env[59659]: DEBUG nova.compute.manager [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] Start building networks asynchronously for instance. 
{{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 722.801047] env[59659]: DEBUG oslo_concurrency.lockutils [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.198s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 722.813013] env[59659]: INFO nova.scheduler.client.report [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] Deleted allocations for instance fdd34513-15af-4294-8a8a-e3b095188eda [ 722.835749] env[59659]: DEBUG oslo_concurrency.lockutils [None req-fb650bf2-8157-43bb-9fcc-30774c4f895c tempest-SecurityGroupsTestJSON-536329627 tempest-SecurityGroupsTestJSON-536329627-project-member] Lock "fdd34513-15af-4294-8a8a-e3b095188eda" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 10.377s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 722.853589] env[59659]: DEBUG nova.compute.utils [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Using /dev/sd instead of None {{(pid=59659) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 722.854813] env[59659]: DEBUG nova.compute.manager [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] Allocating IP information in the background. {{(pid=59659) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 722.854892] env[59659]: DEBUG nova.network.neutron [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] allocate_for_instance() {{(pid=59659) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 722.865080] env[59659]: DEBUG nova.compute.manager [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] Start building block device mappings for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 722.934319] env[59659]: DEBUG nova.compute.manager [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] Start spawning the instance on the hypervisor. 
{{(pid=59659) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 722.952622] env[59659]: DEBUG nova.network.neutron [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] [instance: 9759f284-26e2-466e-9504-ffb63a359f27] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 722.963367] env[59659]: DEBUG nova.virt.hardware [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-30T19:41:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-30T19:41:37Z,direct_url=,disk_format='vmdk',id=0fa786c9-f55e-46dc-b725-aa456ca9ff53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='3f4ccdf020b5413c9a4233eccb6b55a8',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-30T19:41:37Z,virtual_size=,visibility=), allow threads: False {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 722.963737] env[59659]: DEBUG nova.virt.hardware [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Flavor limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 722.963826] env[59659]: DEBUG nova.virt.hardware [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Image limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 722.964060] env[59659]: DEBUG nova.virt.hardware [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Flavor pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 722.964150] env[59659]: DEBUG nova.virt.hardware [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Image pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 722.964289] env[59659]: DEBUG nova.virt.hardware [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 722.964522] env[59659]: DEBUG nova.virt.hardware [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59659) 
_get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 722.964647] env[59659]: DEBUG nova.virt.hardware [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 722.964793] env[59659]: DEBUG nova.virt.hardware [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Got 1 possible topologies {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 722.964947] env[59659]: DEBUG nova.virt.hardware [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 722.965129] env[59659]: DEBUG nova.virt.hardware [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 722.966213] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a71041df-6e1d-4559-a80e-04ecc45fb081 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 722.971865] env[59659]: DEBUG oslo_concurrency.lockutils [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] Releasing lock "refresh_cache-9759f284-26e2-466e-9504-ffb63a359f27" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 722.972249] env[59659]: DEBUG nova.compute.manager [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] [instance: 9759f284-26e2-466e-9504-ffb63a359f27] Start destroying the instance on the hypervisor. 
{{(pid=59659) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 722.972414] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] [instance: 9759f284-26e2-466e-9504-ffb63a359f27] Destroying instance {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 722.973469] env[59659]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-ff91c73f-ec64-4d9e-90a2-978c5f7c7a50 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 722.978622] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ce4beb7-a47b-4dc4-8b9e-f93c8785fdff {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 722.984718] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e463314e-4ecc-417e-a5a7-411ac1f61f7c {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 722.990041] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c78c96b-e763-4ab5-ac33-8915c303b6bc {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 723.012436] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e5cabe09-fd0a-4708-9613-41ddd3d486e1 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 723.049127] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4005836-92aa-4991-a567-a17abd7779ed {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 723.052067] env[59659]: WARNING nova.virt.vmwareapi.vmops [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] [instance: 9759f284-26e2-466e-9504-ffb63a359f27] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 9759f284-26e2-466e-9504-ffb63a359f27 could not be found. [ 723.052280] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] [instance: 9759f284-26e2-466e-9504-ffb63a359f27] Instance destroyed {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 723.052453] env[59659]: INFO nova.compute.manager [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] [instance: 9759f284-26e2-466e-9504-ffb63a359f27] Took 0.08 seconds to destroy the instance on the hypervisor. [ 723.052683] env[59659]: DEBUG oslo.service.loopingcall [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=59659) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 723.052907] env[59659]: DEBUG nova.compute.manager [-] [instance: 9759f284-26e2-466e-9504-ffb63a359f27] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 723.052992] env[59659]: DEBUG nova.network.neutron [-] [instance: 9759f284-26e2-466e-9504-ffb63a359f27] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 723.059324] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-adb60867-f798-4ac5-b85e-e8de443906a3 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 723.072669] env[59659]: DEBUG nova.compute.provider_tree [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 723.092029] env[59659]: DEBUG nova.scheduler.client.report [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 723.112390] env[59659]: DEBUG oslo_concurrency.lockutils [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.311s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 723.125009] env[59659]: ERROR nova.compute.manager [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] [instance: 62ade33c-5283-432d-872c-cc162254317d] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 88dc529c-e6b7-4d6e-b7a7-17e32a9977fd, please check neutron logs for more information. 
[ 723.125009] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] Traceback (most recent call last): [ 723.125009] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 723.125009] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] self.driver.spawn(context, instance, image_meta, [ 723.125009] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 723.125009] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 723.125009] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 723.125009] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] vm_ref = self.build_virtual_machine(instance, [ 723.125009] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 723.125009] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] vif_infos = vmwarevif.get_vif_info(self._session, [ 723.125009] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 723.125483] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] for vif in network_info: [ 723.125483] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 723.125483] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] return self._sync_wrapper(fn, *args, **kwargs) [ 723.125483] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 723.125483] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] self.wait() [ 723.125483] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 723.125483] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] self[:] = self._gt.wait() [ 723.125483] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 723.125483] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] return self._exit_event.wait() [ 723.125483] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 723.125483] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] result = hub.switch() [ 723.125483] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 723.125483] env[59659]: ERROR 
nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] return self.greenlet.switch() [ 723.125483] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 723.126549] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] result = function(*args, **kwargs) [ 723.126549] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 723.126549] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] return func(*args, **kwargs) [ 723.126549] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 723.126549] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] raise e [ 723.126549] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 723.126549] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] nwinfo = self.network_api.allocate_for_instance( [ 723.126549] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 723.126549] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] created_port_ids = self._update_ports_for_instance( [ 723.126549] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 723.126549] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] with excutils.save_and_reraise_exception(): [ 723.126549] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 723.126549] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] self.force_reraise() [ 723.127043] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 723.127043] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] raise self.value [ 723.127043] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 723.127043] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] updated_port = self._update_port( [ 723.127043] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 723.127043] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] _ensure_no_port_binding_failure(port) [ 723.127043] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 723.127043] env[59659]: ERROR 
nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] raise exception.PortBindingFailed(port_id=port['id']) [ 723.127043] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] nova.exception.PortBindingFailed: Binding failed for port 88dc529c-e6b7-4d6e-b7a7-17e32a9977fd, please check neutron logs for more information. [ 723.127043] env[59659]: ERROR nova.compute.manager [instance: 62ade33c-5283-432d-872c-cc162254317d] [ 723.127043] env[59659]: DEBUG nova.compute.utils [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] [instance: 62ade33c-5283-432d-872c-cc162254317d] Binding failed for port 88dc529c-e6b7-4d6e-b7a7-17e32a9977fd, please check neutron logs for more information. {{(pid=59659) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 723.127423] env[59659]: DEBUG nova.compute.manager [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] [instance: 62ade33c-5283-432d-872c-cc162254317d] Build of instance 62ade33c-5283-432d-872c-cc162254317d was re-scheduled: Binding failed for port 88dc529c-e6b7-4d6e-b7a7-17e32a9977fd, please check neutron logs for more information. {{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 723.127423] env[59659]: DEBUG nova.compute.manager [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] [instance: 62ade33c-5283-432d-872c-cc162254317d] Unplugging VIFs for instance {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 723.127423] env[59659]: DEBUG oslo_concurrency.lockutils [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] Acquiring lock "refresh_cache-62ade33c-5283-432d-872c-cc162254317d" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 723.127423] env[59659]: DEBUG oslo_concurrency.lockutils [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] Acquired lock "refresh_cache-62ade33c-5283-432d-872c-cc162254317d" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 723.127624] env[59659]: DEBUG nova.network.neutron [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] [instance: 62ade33c-5283-432d-872c-cc162254317d] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 723.151604] env[59659]: DEBUG nova.policy [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b8d25c6dcda1421b82c920c9580bf020', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f858ee4b23fb49c399f103c4a8bcdebc', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 
'service_project_domain_id': None, 'service_roles': []} {{(pid=59659) authorize /opt/stack/nova/nova/policy.py:203}} [ 723.153933] env[59659]: DEBUG nova.network.neutron [-] [instance: 9759f284-26e2-466e-9504-ffb63a359f27] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 723.165877] env[59659]: DEBUG nova.network.neutron [-] [instance: 9759f284-26e2-466e-9504-ffb63a359f27] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 723.176019] env[59659]: INFO nova.compute.manager [-] [instance: 9759f284-26e2-466e-9504-ffb63a359f27] Took 0.12 seconds to deallocate network for instance. [ 723.177426] env[59659]: DEBUG nova.compute.claims [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] [instance: 9759f284-26e2-466e-9504-ffb63a359f27] Aborting claim: {{(pid=59659) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 723.177591] env[59659]: DEBUG oslo_concurrency.lockutils [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 723.177789] env[59659]: DEBUG oslo_concurrency.lockutils [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 723.195201] env[59659]: DEBUG nova.network.neutron [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] [instance: 62ade33c-5283-432d-872c-cc162254317d] Instance cache missing network info. 
{{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 723.349173] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5fbc110-cde9-4ecd-abd7-dd7d95d7ff12 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 723.357806] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7089c7ce-1c6c-43c1-b42e-fbc5d2206b0b {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 723.389045] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-baa694db-5500-46f6-b8dd-e8c48337a730 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 723.397115] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a86274c8-1ffc-4c6b-880b-e5f1f3c94933 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 723.410453] env[59659]: DEBUG nova.compute.provider_tree [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 723.418902] env[59659]: DEBUG nova.scheduler.client.report [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 723.436523] env[59659]: DEBUG oslo_concurrency.lockutils [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.259s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 723.437919] env[59659]: ERROR nova.compute.manager [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] [instance: 9759f284-26e2-466e-9504-ffb63a359f27] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port f96d3967-9568-40ee-9bd9-a29f08464a46, please check neutron logs for more information. 
[ 723.437919] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] Traceback (most recent call last): [ 723.437919] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 723.437919] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] self.driver.spawn(context, instance, image_meta, [ 723.437919] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 723.437919] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] self._vmops.spawn(context, instance, image_meta, injected_files, [ 723.437919] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 723.437919] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] vm_ref = self.build_virtual_machine(instance, [ 723.437919] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 723.437919] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] vif_infos = vmwarevif.get_vif_info(self._session, [ 723.437919] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 723.438643] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] for vif in network_info: [ 723.438643] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 723.438643] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] return self._sync_wrapper(fn, *args, **kwargs) [ 723.438643] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 723.438643] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] self.wait() [ 723.438643] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 723.438643] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] self[:] = self._gt.wait() [ 723.438643] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 723.438643] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] return self._exit_event.wait() [ 723.438643] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 723.438643] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] result = hub.switch() [ 723.438643] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 723.438643] env[59659]: ERROR 
nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] return self.greenlet.switch() [ 723.438643] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 723.439280] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] result = function(*args, **kwargs) [ 723.439280] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 723.439280] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] return func(*args, **kwargs) [ 723.439280] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 723.439280] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] raise e [ 723.439280] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 723.439280] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] nwinfo = self.network_api.allocate_for_instance( [ 723.439280] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 723.439280] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] created_port_ids = self._update_ports_for_instance( [ 723.439280] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 723.439280] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] with excutils.save_and_reraise_exception(): [ 723.439280] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 723.439280] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] self.force_reraise() [ 723.439727] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 723.439727] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] raise self.value [ 723.439727] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 723.439727] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] updated_port = self._update_port( [ 723.439727] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 723.439727] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] _ensure_no_port_binding_failure(port) [ 723.439727] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 723.439727] env[59659]: ERROR 
nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] raise exception.PortBindingFailed(port_id=port['id']) [ 723.439727] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] nova.exception.PortBindingFailed: Binding failed for port f96d3967-9568-40ee-9bd9-a29f08464a46, please check neutron logs for more information. [ 723.439727] env[59659]: ERROR nova.compute.manager [instance: 9759f284-26e2-466e-9504-ffb63a359f27] [ 723.440076] env[59659]: DEBUG nova.compute.utils [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] [instance: 9759f284-26e2-466e-9504-ffb63a359f27] Binding failed for port f96d3967-9568-40ee-9bd9-a29f08464a46, please check neutron logs for more information. {{(pid=59659) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 723.440457] env[59659]: DEBUG nova.compute.manager [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] [instance: 9759f284-26e2-466e-9504-ffb63a359f27] Build of instance 9759f284-26e2-466e-9504-ffb63a359f27 was re-scheduled: Binding failed for port f96d3967-9568-40ee-9bd9-a29f08464a46, please check neutron logs for more information. {{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 723.441231] env[59659]: DEBUG nova.compute.manager [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] [instance: 9759f284-26e2-466e-9504-ffb63a359f27] Unplugging VIFs for instance {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 723.441588] env[59659]: DEBUG oslo_concurrency.lockutils [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] Acquiring lock "refresh_cache-9759f284-26e2-466e-9504-ffb63a359f27" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 723.441745] env[59659]: DEBUG oslo_concurrency.lockutils [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] Acquired lock "refresh_cache-9759f284-26e2-466e-9504-ffb63a359f27" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 723.442127] env[59659]: DEBUG nova.network.neutron [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] [instance: 9759f284-26e2-466e-9504-ffb63a359f27] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 723.553646] env[59659]: DEBUG nova.network.neutron [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] [instance: 9759f284-26e2-466e-9504-ffb63a359f27] Instance cache missing network info. 
{{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 723.580062] env[59659]: DEBUG nova.network.neutron [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] [instance: 62ade33c-5283-432d-872c-cc162254317d] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 723.588585] env[59659]: DEBUG oslo_concurrency.lockutils [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] Releasing lock "refresh_cache-62ade33c-5283-432d-872c-cc162254317d" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 723.588810] env[59659]: DEBUG nova.compute.manager [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 723.588986] env[59659]: DEBUG nova.compute.manager [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] [instance: 62ade33c-5283-432d-872c-cc162254317d] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 723.589161] env[59659]: DEBUG nova.network.neutron [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] [instance: 62ade33c-5283-432d-872c-cc162254317d] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 723.622192] env[59659]: DEBUG nova.network.neutron [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] [instance: 62ade33c-5283-432d-872c-cc162254317d] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 723.628616] env[59659]: DEBUG nova.network.neutron [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] [instance: 62ade33c-5283-432d-872c-cc162254317d] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 723.639065] env[59659]: INFO nova.compute.manager [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] [instance: 62ade33c-5283-432d-872c-cc162254317d] Took 0.05 seconds to deallocate network for instance. 
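Note on the PortBindingFailed aborts traced above: they originate in _ensure_no_port_binding_failure (nova/network/neutron.py:294 in these frames), which turns a failed Neutron port binding into an exception; that exception unwinds through _allocate_network_async, the build is re-scheduled, and the cleanup records above follow (no unplug_vifs in the driver, network deallocation, an empty instance_info_cache). Below is a minimal, self-contained sketch of that check. It is not the Nova source, and the use of binding:vif_type == 'binding_failed' as the failure signal is an assumption made for illustration.

    # Sketch only: how a check like _ensure_no_port_binding_failure can detect a
    # failed Neutron binding. The 'binding:vif_type' == 'binding_failed' convention
    # is assumed here, not taken from the log.

    class PortBindingFailed(Exception):
        def __init__(self, port_id):
            super().__init__(
                "Binding failed for port %s, please check neutron logs for more "
                "information." % port_id)
            self.port_id = port_id

    def ensure_no_port_binding_failure(port):
        """Raise PortBindingFailed if the Neutron port reports a failed binding."""
        if port.get('binding:vif_type') == 'binding_failed':
            raise PortBindingFailed(port_id=port['id'])

    # Example port dict, reduced to the fields the check needs (illustrative only):
    port = {'id': 'f96d3967-9568-40ee-9bd9-a29f08464a46',
            'binding:vif_type': 'binding_failed'}
    try:
        ensure_no_port_binding_failure(port)
    except PortBindingFailed as exc:
        print(exc)

In the tracebacks above the caller wraps this check in excutils.save_and_reraise_exception() so partially created ports can be cleaned up before the exception propagates; see the note at the end of this section.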
[ 723.729396] env[59659]: INFO nova.scheduler.client.report [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] Deleted allocations for instance 62ade33c-5283-432d-872c-cc162254317d [ 723.745601] env[59659]: DEBUG oslo_concurrency.lockutils [None req-8d1248a1-05de-4a70-8c0b-9763d4c8c4e3 tempest-VolumesAdminNegativeTest-116106725 tempest-VolumesAdminNegativeTest-116106725-project-member] Lock "62ade33c-5283-432d-872c-cc162254317d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 10.116s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 724.266734] env[59659]: DEBUG nova.network.neutron [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] [instance: 9759f284-26e2-466e-9504-ffb63a359f27] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 724.275685] env[59659]: DEBUG oslo_concurrency.lockutils [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] Releasing lock "refresh_cache-9759f284-26e2-466e-9504-ffb63a359f27" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 724.276053] env[59659]: DEBUG nova.compute.manager [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 724.276146] env[59659]: DEBUG nova.compute.manager [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] [instance: 9759f284-26e2-466e-9504-ffb63a359f27] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 724.276235] env[59659]: DEBUG nova.network.neutron [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] [instance: 9759f284-26e2-466e-9504-ffb63a359f27] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 724.388512] env[59659]: DEBUG nova.network.neutron [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] [instance: 9759f284-26e2-466e-9504-ffb63a359f27] Instance cache missing network info. 
{{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 724.398181] env[59659]: DEBUG nova.network.neutron [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] [instance: 9759f284-26e2-466e-9504-ffb63a359f27] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 724.408493] env[59659]: INFO nova.compute.manager [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] [instance: 9759f284-26e2-466e-9504-ffb63a359f27] Took 0.13 seconds to deallocate network for instance. [ 724.428091] env[59659]: DEBUG nova.network.neutron [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] Successfully created port: dbbd873b-748b-4380-b79b-0889fea0b6d1 {{(pid=59659) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 724.544962] env[59659]: INFO nova.scheduler.client.report [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] Deleted allocations for instance 9759f284-26e2-466e-9504-ffb63a359f27 [ 724.567303] env[59659]: DEBUG oslo_concurrency.lockutils [None req-04894b72-62f4-42ce-bcd9-93286e30827e tempest-ServersAdminNegativeTestJSON-370113334 tempest-ServersAdminNegativeTestJSON-370113334-project-member] Lock "9759f284-26e2-466e-9504-ffb63a359f27" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 12.728s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 724.773107] env[59659]: DEBUG oslo_concurrency.lockutils [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Acquiring lock "a411c5e7-5a49-463e-b270-800e35a31188" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 724.773107] env[59659]: DEBUG oslo_concurrency.lockutils [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Lock "a411c5e7-5a49-463e-b270-800e35a31188" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 724.783098] env[59659]: DEBUG nova.compute.manager [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: a411c5e7-5a49-463e-b270-800e35a31188] Starting instance... 
{{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 724.836549] env[59659]: DEBUG oslo_concurrency.lockutils [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 724.836798] env[59659]: DEBUG oslo_concurrency.lockutils [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 724.838588] env[59659]: INFO nova.compute.claims [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: a411c5e7-5a49-463e-b270-800e35a31188] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 725.023589] env[59659]: DEBUG nova.network.neutron [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] Successfully created port: 346f3b4a-8fe0-46e9-8eb3-97e76ce72c7f {{(pid=59659) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 725.039812] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d522060-0e1e-49d1-b4bd-b17ff9b1012c {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 725.049170] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0eef47ea-bfb7-49e9-8266-4ecb61eb7db7 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 725.084151] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-006192ce-3952-40f4-be0c-5d5d04cee913 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 725.091219] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-299e0874-54d0-4d1e-9e15-500250537a9d {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 725.109565] env[59659]: DEBUG nova.compute.provider_tree [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 725.121040] env[59659]: DEBUG nova.scheduler.client.report [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 
'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 725.142124] env[59659]: DEBUG oslo_concurrency.lockutils [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.305s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 725.142124] env[59659]: DEBUG nova.compute.manager [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: a411c5e7-5a49-463e-b270-800e35a31188] Start building networks asynchronously for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 725.198903] env[59659]: DEBUG nova.compute.utils [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Using /dev/sd instead of None {{(pid=59659) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 725.200257] env[59659]: DEBUG nova.compute.manager [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: a411c5e7-5a49-463e-b270-800e35a31188] Allocating IP information in the background. {{(pid=59659) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 725.200588] env[59659]: DEBUG nova.network.neutron [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: a411c5e7-5a49-463e-b270-800e35a31188] allocate_for_instance() {{(pid=59659) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 725.211599] env[59659]: DEBUG nova.compute.manager [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: a411c5e7-5a49-463e-b270-800e35a31188] Start building block device mappings for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 725.288854] env[59659]: DEBUG nova.compute.manager [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: a411c5e7-5a49-463e-b270-800e35a31188] Start spawning the instance on the hypervisor. 
{{(pid=59659) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 725.313609] env[59659]: DEBUG nova.virt.hardware [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-30T19:41:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-30T19:41:37Z,direct_url=,disk_format='vmdk',id=0fa786c9-f55e-46dc-b725-aa456ca9ff53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='3f4ccdf020b5413c9a4233eccb6b55a8',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-30T19:41:37Z,virtual_size=,visibility=), allow threads: False {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 725.314727] env[59659]: DEBUG nova.virt.hardware [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Flavor limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 725.314935] env[59659]: DEBUG nova.virt.hardware [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Image limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 725.315184] env[59659]: DEBUG nova.virt.hardware [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Flavor pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 725.315899] env[59659]: DEBUG nova.virt.hardware [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Image pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 725.316111] env[59659]: DEBUG nova.virt.hardware [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 725.316361] env[59659]: DEBUG nova.virt.hardware [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 725.316550] env[59659]: DEBUG nova.virt.hardware [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 725.316743] env[59659]: DEBUG nova.virt.hardware [None 
req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Got 1 possible topologies {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 725.316932] env[59659]: DEBUG nova.virt.hardware [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 725.317184] env[59659]: DEBUG nova.virt.hardware [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 725.318267] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5bfa0ba3-94dc-4dce-becd-d888bd00135e {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 725.327976] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de4c685e-5cda-4901-9272-32bacb616238 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 725.581030] env[59659]: DEBUG nova.policy [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '593629524d524ad9b515d92b36e7b1e8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '91232108c8944a3da00233e9c54c9749', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59659) authorize /opt/stack/nova/nova/policy.py:203}} [ 726.949934] env[59659]: ERROR nova.compute.manager [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 5094a1db-1c30-434f-bb1a-00afbf387d0d, please check neutron logs for more information. 
[ 726.949934] env[59659]: ERROR nova.compute.manager Traceback (most recent call last): [ 726.949934] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 726.949934] env[59659]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 726.949934] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 726.949934] env[59659]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 726.949934] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 726.949934] env[59659]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 726.949934] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 726.949934] env[59659]: ERROR nova.compute.manager self.force_reraise() [ 726.949934] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 726.949934] env[59659]: ERROR nova.compute.manager raise self.value [ 726.949934] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 726.949934] env[59659]: ERROR nova.compute.manager updated_port = self._update_port( [ 726.949934] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 726.949934] env[59659]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 726.955536] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 726.955536] env[59659]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 726.955536] env[59659]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 5094a1db-1c30-434f-bb1a-00afbf387d0d, please check neutron logs for more information. 
[ 726.955536] env[59659]: ERROR nova.compute.manager [ 726.955536] env[59659]: Traceback (most recent call last): [ 726.955536] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 726.955536] env[59659]: listener.cb(fileno) [ 726.955536] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 726.955536] env[59659]: result = function(*args, **kwargs) [ 726.955536] env[59659]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 726.955536] env[59659]: return func(*args, **kwargs) [ 726.955536] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 726.955536] env[59659]: raise e [ 726.955536] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 726.955536] env[59659]: nwinfo = self.network_api.allocate_for_instance( [ 726.955536] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 726.955536] env[59659]: created_port_ids = self._update_ports_for_instance( [ 726.955536] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 726.955536] env[59659]: with excutils.save_and_reraise_exception(): [ 726.955536] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 726.955536] env[59659]: self.force_reraise() [ 726.955536] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 726.955536] env[59659]: raise self.value [ 726.955536] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 726.955536] env[59659]: updated_port = self._update_port( [ 726.955536] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 726.955536] env[59659]: _ensure_no_port_binding_failure(port) [ 726.955536] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 726.955536] env[59659]: raise exception.PortBindingFailed(port_id=port['id']) [ 726.957172] env[59659]: nova.exception.PortBindingFailed: Binding failed for port 5094a1db-1c30-434f-bb1a-00afbf387d0d, please check neutron logs for more information. [ 726.957172] env[59659]: Removing descriptor: 22 [ 726.957172] env[59659]: ERROR nova.compute.manager [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 5094a1db-1c30-434f-bb1a-00afbf387d0d, please check neutron logs for more information. 
[ 726.957172] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] Traceback (most recent call last): [ 726.957172] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 726.957172] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] yield resources [ 726.957172] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 726.957172] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] self.driver.spawn(context, instance, image_meta, [ 726.957172] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 726.957172] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] self._vmops.spawn(context, instance, image_meta, injected_files, [ 726.957172] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 726.957172] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] vm_ref = self.build_virtual_machine(instance, [ 726.957763] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 726.957763] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] vif_infos = vmwarevif.get_vif_info(self._session, [ 726.957763] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 726.957763] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] for vif in network_info: [ 726.957763] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 726.957763] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] return self._sync_wrapper(fn, *args, **kwargs) [ 726.957763] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 726.957763] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] self.wait() [ 726.957763] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 726.957763] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] self[:] = self._gt.wait() [ 726.957763] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 726.957763] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] return self._exit_event.wait() [ 726.957763] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 726.957763] env[59659]: ERROR nova.compute.manager 
[instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] result = hub.switch() [ 726.958643] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 726.958643] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] return self.greenlet.switch() [ 726.958643] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 726.958643] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] result = function(*args, **kwargs) [ 726.958643] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 726.958643] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] return func(*args, **kwargs) [ 726.958643] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 726.958643] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] raise e [ 726.958643] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 726.958643] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] nwinfo = self.network_api.allocate_for_instance( [ 726.958643] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 726.958643] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] created_port_ids = self._update_ports_for_instance( [ 726.958643] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 726.958966] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] with excutils.save_and_reraise_exception(): [ 726.958966] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 726.958966] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] self.force_reraise() [ 726.958966] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 726.958966] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] raise self.value [ 726.958966] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 726.958966] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] updated_port = self._update_port( [ 726.958966] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 726.958966] env[59659]: ERROR nova.compute.manager [instance: 
1f7f6276-cfe5-4427-90b6-893e7ad6cffe] _ensure_no_port_binding_failure(port) [ 726.958966] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 726.958966] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] raise exception.PortBindingFailed(port_id=port['id']) [ 726.958966] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] nova.exception.PortBindingFailed: Binding failed for port 5094a1db-1c30-434f-bb1a-00afbf387d0d, please check neutron logs for more information. [ 726.958966] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] [ 726.959297] env[59659]: INFO nova.compute.manager [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] Terminating instance [ 726.959297] env[59659]: DEBUG oslo_concurrency.lockutils [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] Acquiring lock "refresh_cache-1f7f6276-cfe5-4427-90b6-893e7ad6cffe" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 726.959297] env[59659]: DEBUG oslo_concurrency.lockutils [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] Acquired lock "refresh_cache-1f7f6276-cfe5-4427-90b6-893e7ad6cffe" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 726.959297] env[59659]: DEBUG nova.network.neutron [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 727.001404] env[59659]: DEBUG nova.network.neutron [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 727.194916] env[59659]: ERROR nova.compute.manager [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 808c0d62-376a-4bfe-a50c-92a1fa8874f0, please check neutron logs for more information. 
[ 727.194916] env[59659]: ERROR nova.compute.manager Traceback (most recent call last): [ 727.194916] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 727.194916] env[59659]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 727.194916] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 727.194916] env[59659]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 727.194916] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 727.194916] env[59659]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 727.194916] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 727.194916] env[59659]: ERROR nova.compute.manager self.force_reraise() [ 727.194916] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 727.194916] env[59659]: ERROR nova.compute.manager raise self.value [ 727.194916] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 727.194916] env[59659]: ERROR nova.compute.manager updated_port = self._update_port( [ 727.194916] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 727.194916] env[59659]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 727.195489] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 727.195489] env[59659]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 727.195489] env[59659]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 808c0d62-376a-4bfe-a50c-92a1fa8874f0, please check neutron logs for more information. 
[ 727.195489] env[59659]: ERROR nova.compute.manager [ 727.195489] env[59659]: Traceback (most recent call last): [ 727.195489] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 727.195489] env[59659]: listener.cb(fileno) [ 727.195489] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 727.195489] env[59659]: result = function(*args, **kwargs) [ 727.195489] env[59659]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 727.195489] env[59659]: return func(*args, **kwargs) [ 727.195489] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 727.195489] env[59659]: raise e [ 727.195489] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 727.195489] env[59659]: nwinfo = self.network_api.allocate_for_instance( [ 727.195489] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 727.195489] env[59659]: created_port_ids = self._update_ports_for_instance( [ 727.195489] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 727.195489] env[59659]: with excutils.save_and_reraise_exception(): [ 727.195489] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 727.195489] env[59659]: self.force_reraise() [ 727.195489] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 727.195489] env[59659]: raise self.value [ 727.195489] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 727.195489] env[59659]: updated_port = self._update_port( [ 727.195489] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 727.195489] env[59659]: _ensure_no_port_binding_failure(port) [ 727.195489] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 727.195489] env[59659]: raise exception.PortBindingFailed(port_id=port['id']) [ 727.196292] env[59659]: nova.exception.PortBindingFailed: Binding failed for port 808c0d62-376a-4bfe-a50c-92a1fa8874f0, please check neutron logs for more information. [ 727.196292] env[59659]: Removing descriptor: 15 [ 727.196292] env[59659]: ERROR nova.compute.manager [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] [instance: ce3bd633-4538-428d-9258-9222c3c72edd] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 808c0d62-376a-4bfe-a50c-92a1fa8874f0, please check neutron logs for more information. 
[ 727.196292] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] Traceback (most recent call last): [ 727.196292] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 727.196292] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] yield resources [ 727.196292] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 727.196292] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] self.driver.spawn(context, instance, image_meta, [ 727.196292] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 727.196292] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] self._vmops.spawn(context, instance, image_meta, injected_files, [ 727.196292] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 727.196292] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] vm_ref = self.build_virtual_machine(instance, [ 727.196625] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 727.196625] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] vif_infos = vmwarevif.get_vif_info(self._session, [ 727.196625] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 727.196625] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] for vif in network_info: [ 727.196625] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 727.196625] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] return self._sync_wrapper(fn, *args, **kwargs) [ 727.196625] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 727.196625] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] self.wait() [ 727.196625] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 727.196625] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] self[:] = self._gt.wait() [ 727.196625] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 727.196625] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] return self._exit_event.wait() [ 727.196625] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 727.196625] env[59659]: ERROR nova.compute.manager 
[instance: ce3bd633-4538-428d-9258-9222c3c72edd] result = hub.switch() [ 727.197015] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 727.197015] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] return self.greenlet.switch() [ 727.197015] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 727.197015] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] result = function(*args, **kwargs) [ 727.197015] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 727.197015] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] return func(*args, **kwargs) [ 727.197015] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 727.197015] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] raise e [ 727.197015] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 727.197015] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] nwinfo = self.network_api.allocate_for_instance( [ 727.197015] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 727.197015] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] created_port_ids = self._update_ports_for_instance( [ 727.197015] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 727.197422] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] with excutils.save_and_reraise_exception(): [ 727.197422] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 727.197422] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] self.force_reraise() [ 727.197422] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 727.197422] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] raise self.value [ 727.197422] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 727.197422] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] updated_port = self._update_port( [ 727.197422] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 727.197422] env[59659]: ERROR nova.compute.manager [instance: 
ce3bd633-4538-428d-9258-9222c3c72edd] _ensure_no_port_binding_failure(port) [ 727.197422] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 727.197422] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] raise exception.PortBindingFailed(port_id=port['id']) [ 727.197422] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] nova.exception.PortBindingFailed: Binding failed for port 808c0d62-376a-4bfe-a50c-92a1fa8874f0, please check neutron logs for more information. [ 727.197422] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] [ 727.197724] env[59659]: INFO nova.compute.manager [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] [instance: ce3bd633-4538-428d-9258-9222c3c72edd] Terminating instance [ 727.198388] env[59659]: DEBUG oslo_concurrency.lockutils [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] Acquiring lock "refresh_cache-ce3bd633-4538-428d-9258-9222c3c72edd" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 727.198540] env[59659]: DEBUG oslo_concurrency.lockutils [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] Acquired lock "refresh_cache-ce3bd633-4538-428d-9258-9222c3c72edd" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 727.198699] env[59659]: DEBUG nova.network.neutron [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] [instance: ce3bd633-4538-428d-9258-9222c3c72edd] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 727.273173] env[59659]: DEBUG nova.network.neutron [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] [instance: ce3bd633-4538-428d-9258-9222c3c72edd] Instance cache missing network info. 
{{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 727.572931] env[59659]: DEBUG nova.network.neutron [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 727.583058] env[59659]: DEBUG oslo_concurrency.lockutils [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] Releasing lock "refresh_cache-1f7f6276-cfe5-4427-90b6-893e7ad6cffe" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 727.583631] env[59659]: DEBUG nova.compute.manager [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] Start destroying the instance on the hypervisor. {{(pid=59659) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 727.584553] env[59659]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-4f8fb9f8-204a-459b-a177-595a6d0e4baf {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 727.595086] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-837285eb-52ff-4845-87de-87d1378c8310 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 727.618751] env[59659]: WARNING nova.virt.vmwareapi.driver [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] Instance does not exists. Proceeding to delete instance properties on datastore: nova.exception.InstanceNotFound: Instance 1f7f6276-cfe5-4427-90b6-893e7ad6cffe could not be found. [ 727.618923] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] Destroying instance {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 727.619272] env[59659]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-d93514e9-c205-4c60-879a-0f09c78d8c36 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 727.628581] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f095afe6-6e1e-4469-8975-e8cac1fdf46a {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 727.658756] env[59659]: WARNING nova.virt.vmwareapi.vmops [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 1f7f6276-cfe5-4427-90b6-893e7ad6cffe could not be found. 
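The traceback above ends in nova.exception.PortBindingFailed raised from _ensure_no_port_binding_failure (nova/network/neutron.py line 294 in this deployment); the same failure mode recurs for the other instances further down. A minimal, self-contained sketch of that check, assuming Neutron reports a failed ML2 binding through the standard 'binding:vif_type' port attribute with the value 'binding_failed'; the class and helper below are illustrative stand-ins for the real Nova code, not copies of it:

    # Illustrative sketch of the check that raises the PortBindingFailed
    # errors in this log. The real code lives in nova/network/neutron.py;
    # this class stands in for nova.exception.PortBindingFailed, and the
    # 'binding_failed' sentinel is assumed from Neutron's port-binding
    # semantics.
    class PortBindingFailed(Exception):
        def __init__(self, port_id):
            super().__init__(
                'Binding failed for port %s, please check neutron logs for '
                'more information.' % port_id)

    VIF_TYPE_BINDING_FAILED = 'binding_failed'

    def ensure_no_port_binding_failure(port):
        # Neutron records the outcome of ML2 port binding in the
        # 'binding:vif_type' attribute of the port dict.
        if port.get('binding:vif_type') == VIF_TYPE_BINDING_FAILED:
            raise PortBindingFailed(port_id=port['id'])

    try:
        ensure_no_port_binding_failure({
            'id': '808c0d62-376a-4bfe-a50c-92a1fa8874f0',
            'binding:vif_type': VIF_TYPE_BINDING_FAILED,
        })
    except PortBindingFailed as exc:
        print(exc)

When the exception propagates out of _allocate_network_async, the compute manager tears the half-built instance down again, which is what the "Terminating instance" and "Building network info cache for instance" lines above record.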
[ 727.659040] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] Instance destroyed {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 727.659227] env[59659]: INFO nova.compute.manager [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] Took 0.08 seconds to destroy the instance on the hypervisor. [ 727.659510] env[59659]: DEBUG oslo.service.loopingcall [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59659) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 727.659733] env[59659]: DEBUG nova.compute.manager [-] [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 727.659827] env[59659]: DEBUG nova.network.neutron [-] [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 727.708454] env[59659]: DEBUG nova.network.neutron [-] [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 727.717725] env[59659]: DEBUG nova.network.neutron [-] [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 727.731657] env[59659]: INFO nova.compute.manager [-] [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] Took 0.07 seconds to deallocate network for instance. [ 727.810088] env[59659]: INFO nova.compute.manager [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] Took 0.08 seconds to detach 1 volumes for instance. 
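The "Acquiring lock ...", "Lock ... acquired ... waited", and "Lock ... released ... held" DEBUG lines throughout this log, for "compute_resources" and the per-instance "refresh_cache-<uuid>" locks, are emitted by oslo.concurrency's lockutils wrapper. A small sketch of that pattern using only the public oslo_concurrency API; the lock names are taken from the log, while the guarded function bodies are placeholders:

    # Sketch of the locking pattern behind the "Acquiring lock" /
    # "acquired" / "released ... held N s" DEBUG lines. Only the public
    # oslo_concurrency API is used; the function bodies are placeholders.
    from oslo_concurrency import lockutils

    @lockutils.synchronized('compute_resources')
    def abort_instance_claim(instance_uuid):
        # Resource-tracker bookkeeping runs while the process-wide
        # "compute_resources" lock is held.
        print('aborting claim for %s' % instance_uuid)

    def refresh_network_cache(instance_uuid):
        # Context-manager form, matching the per-instance
        # "refresh_cache-<uuid>" locks taken before rebuilding the
        # network info cache.
        with lockutils.lock('refresh_cache-%s' % instance_uuid):
            print('rebuilding network info cache for %s' % instance_uuid)

    abort_instance_claim('1f7f6276-cfe5-4427-90b6-893e7ad6cffe')
    refresh_network_cache('1f7f6276-cfe5-4427-90b6-893e7ad6cffe')

The waited/held durations in the surrounding records (e.g. "held 0.284s" for the claim abort below) are reported by this same wrapper.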
[ 727.814080] env[59659]: DEBUG nova.compute.claims [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] Aborting claim: {{(pid=59659) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 727.814388] env[59659]: DEBUG oslo_concurrency.lockutils [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 727.814801] env[59659]: DEBUG oslo_concurrency.lockutils [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 727.999429] env[59659]: DEBUG nova.network.neutron [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: a411c5e7-5a49-463e-b270-800e35a31188] Successfully created port: a63c96c0-e7e4-4bf2-9ea3-7ce0cbf6e40f {{(pid=59659) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 728.009902] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f3253522-b282-4b9b-96a2-7aaeb4121386 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 728.017826] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f45e0bc4-34bc-49e7-a499-dec57b49e632 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 728.049606] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35ccb59e-9ecf-4832-973f-cc7f66a8c443 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 728.057950] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85196644-1e3d-4d19-9768-887aab990e00 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 728.070588] env[59659]: DEBUG nova.compute.provider_tree [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 728.084785] env[59659]: DEBUG nova.scheduler.client.report [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 
1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 728.098774] env[59659]: DEBUG oslo_concurrency.lockutils [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.284s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 728.099709] env[59659]: ERROR nova.compute.manager [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 5094a1db-1c30-434f-bb1a-00afbf387d0d, please check neutron logs for more information. [ 728.099709] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] Traceback (most recent call last): [ 728.099709] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 728.099709] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] self.driver.spawn(context, instance, image_meta, [ 728.099709] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 728.099709] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] self._vmops.spawn(context, instance, image_meta, injected_files, [ 728.099709] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 728.099709] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] vm_ref = self.build_virtual_machine(instance, [ 728.099709] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 728.099709] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] vif_infos = vmwarevif.get_vif_info(self._session, [ 728.099709] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 728.100118] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] for vif in network_info: [ 728.100118] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 728.100118] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] return self._sync_wrapper(fn, *args, **kwargs) [ 728.100118] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 728.100118] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] self.wait() [ 728.100118] env[59659]: ERROR nova.compute.manager [instance: 
1f7f6276-cfe5-4427-90b6-893e7ad6cffe] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 728.100118] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] self[:] = self._gt.wait() [ 728.100118] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 728.100118] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] return self._exit_event.wait() [ 728.100118] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 728.100118] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] result = hub.switch() [ 728.100118] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 728.100118] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] return self.greenlet.switch() [ 728.100118] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 728.100447] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] result = function(*args, **kwargs) [ 728.100447] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 728.100447] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] return func(*args, **kwargs) [ 728.100447] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 728.100447] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] raise e [ 728.100447] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 728.100447] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] nwinfo = self.network_api.allocate_for_instance( [ 728.100447] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 728.100447] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] created_port_ids = self._update_ports_for_instance( [ 728.100447] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 728.100447] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] with excutils.save_and_reraise_exception(): [ 728.100447] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 728.100447] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] self.force_reraise() [ 728.100758] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] 
File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 728.100758] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] raise self.value [ 728.100758] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 728.100758] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] updated_port = self._update_port( [ 728.100758] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 728.100758] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] _ensure_no_port_binding_failure(port) [ 728.100758] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 728.100758] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] raise exception.PortBindingFailed(port_id=port['id']) [ 728.100758] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] nova.exception.PortBindingFailed: Binding failed for port 5094a1db-1c30-434f-bb1a-00afbf387d0d, please check neutron logs for more information. [ 728.100758] env[59659]: ERROR nova.compute.manager [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] [ 728.100758] env[59659]: DEBUG nova.compute.utils [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] Binding failed for port 5094a1db-1c30-434f-bb1a-00afbf387d0d, please check neutron logs for more information. {{(pid=59659) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 728.101985] env[59659]: DEBUG nova.compute.manager [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] Build of instance 1f7f6276-cfe5-4427-90b6-893e7ad6cffe was re-scheduled: Binding failed for port 5094a1db-1c30-434f-bb1a-00afbf387d0d, please check neutron logs for more information. 
{{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 728.102402] env[59659]: DEBUG nova.compute.manager [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] Unplugging VIFs for instance {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 728.102621] env[59659]: DEBUG oslo_concurrency.lockutils [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] Acquiring lock "refresh_cache-1f7f6276-cfe5-4427-90b6-893e7ad6cffe" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 728.102757] env[59659]: DEBUG oslo_concurrency.lockutils [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] Acquired lock "refresh_cache-1f7f6276-cfe5-4427-90b6-893e7ad6cffe" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 728.102909] env[59659]: DEBUG nova.network.neutron [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 728.145042] env[59659]: DEBUG nova.network.neutron [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 728.184040] env[59659]: DEBUG nova.network.neutron [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] [instance: ce3bd633-4538-428d-9258-9222c3c72edd] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 728.191152] env[59659]: DEBUG oslo_concurrency.lockutils [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] Releasing lock "refresh_cache-ce3bd633-4538-428d-9258-9222c3c72edd" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 728.191562] env[59659]: DEBUG nova.compute.manager [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] [instance: ce3bd633-4538-428d-9258-9222c3c72edd] Start destroying the instance on the hypervisor. 
{{(pid=59659) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 728.191750] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] [instance: ce3bd633-4538-428d-9258-9222c3c72edd] Destroying instance {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 728.192279] env[59659]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-af50f42a-8ebe-453a-9b29-93d2d8a23486 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 728.204170] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a6f78c9b-0e20-4be3-b8ea-6c186bda5b03 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 728.232874] env[59659]: WARNING nova.virt.vmwareapi.vmops [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] [instance: ce3bd633-4538-428d-9258-9222c3c72edd] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance ce3bd633-4538-428d-9258-9222c3c72edd could not be found. [ 728.232874] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] [instance: ce3bd633-4538-428d-9258-9222c3c72edd] Instance destroyed {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 728.232874] env[59659]: INFO nova.compute.manager [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] [instance: ce3bd633-4538-428d-9258-9222c3c72edd] Took 0.04 seconds to destroy the instance on the hypervisor. [ 728.232874] env[59659]: DEBUG oslo.service.loopingcall [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59659) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 728.232874] env[59659]: DEBUG nova.compute.manager [-] [instance: ce3bd633-4538-428d-9258-9222c3c72edd] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 728.233266] env[59659]: DEBUG nova.network.neutron [-] [instance: ce3bd633-4538-428d-9258-9222c3c72edd] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 728.337644] env[59659]: DEBUG nova.network.neutron [-] [instance: ce3bd633-4538-428d-9258-9222c3c72edd] Instance cache missing network info. 
{{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 728.347960] env[59659]: DEBUG nova.network.neutron [-] [instance: ce3bd633-4538-428d-9258-9222c3c72edd] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 728.362215] env[59659]: INFO nova.compute.manager [-] [instance: ce3bd633-4538-428d-9258-9222c3c72edd] Took 0.13 seconds to deallocate network for instance. [ 728.362215] env[59659]: DEBUG nova.compute.claims [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] [instance: ce3bd633-4538-428d-9258-9222c3c72edd] Aborting claim: {{(pid=59659) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 728.362215] env[59659]: DEBUG oslo_concurrency.lockutils [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 728.362215] env[59659]: DEBUG oslo_concurrency.lockutils [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 728.388397] env[59659]: DEBUG nova.network.neutron [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 728.400374] env[59659]: DEBUG oslo_concurrency.lockutils [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] Releasing lock "refresh_cache-1f7f6276-cfe5-4427-90b6-893e7ad6cffe" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 728.400590] env[59659]: DEBUG nova.compute.manager [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 728.400857] env[59659]: DEBUG nova.compute.manager [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 728.400919] env[59659]: DEBUG nova.network.neutron [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 728.430827] env[59659]: DEBUG nova.network.neutron [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 728.442987] env[59659]: DEBUG nova.network.neutron [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 728.455084] env[59659]: INFO nova.compute.manager [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] [instance: 1f7f6276-cfe5-4427-90b6-893e7ad6cffe] Took 0.05 seconds to deallocate network for instance. [ 728.501739] env[59659]: ERROR nova.compute.manager [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port dbbd873b-748b-4380-b79b-0889fea0b6d1, please check neutron logs for more information. 
[ 728.501739] env[59659]: ERROR nova.compute.manager Traceback (most recent call last): [ 728.501739] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 728.501739] env[59659]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 728.501739] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 728.501739] env[59659]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 728.501739] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 728.501739] env[59659]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 728.501739] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 728.501739] env[59659]: ERROR nova.compute.manager self.force_reraise() [ 728.501739] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 728.501739] env[59659]: ERROR nova.compute.manager raise self.value [ 728.501739] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 728.501739] env[59659]: ERROR nova.compute.manager updated_port = self._update_port( [ 728.501739] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 728.501739] env[59659]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 728.502480] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 728.502480] env[59659]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 728.502480] env[59659]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port dbbd873b-748b-4380-b79b-0889fea0b6d1, please check neutron logs for more information. 
[ 728.502480] env[59659]: ERROR nova.compute.manager [ 728.502480] env[59659]: Traceback (most recent call last): [ 728.502480] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 728.502480] env[59659]: listener.cb(fileno) [ 728.502480] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 728.502480] env[59659]: result = function(*args, **kwargs) [ 728.502480] env[59659]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 728.502480] env[59659]: return func(*args, **kwargs) [ 728.502480] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 728.502480] env[59659]: raise e [ 728.502480] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 728.502480] env[59659]: nwinfo = self.network_api.allocate_for_instance( [ 728.502480] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 728.502480] env[59659]: created_port_ids = self._update_ports_for_instance( [ 728.502480] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 728.502480] env[59659]: with excutils.save_and_reraise_exception(): [ 728.502480] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 728.502480] env[59659]: self.force_reraise() [ 728.502480] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 728.502480] env[59659]: raise self.value [ 728.502480] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 728.502480] env[59659]: updated_port = self._update_port( [ 728.502480] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 728.502480] env[59659]: _ensure_no_port_binding_failure(port) [ 728.502480] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 728.502480] env[59659]: raise exception.PortBindingFailed(port_id=port['id']) [ 728.505309] env[59659]: nova.exception.PortBindingFailed: Binding failed for port dbbd873b-748b-4380-b79b-0889fea0b6d1, please check neutron logs for more information. [ 728.505309] env[59659]: Removing descriptor: 12 [ 728.505309] env[59659]: ERROR nova.compute.manager [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port dbbd873b-748b-4380-b79b-0889fea0b6d1, please check neutron logs for more information. 
[ 728.505309] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] Traceback (most recent call last): [ 728.505309] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 728.505309] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] yield resources [ 728.505309] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 728.505309] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] self.driver.spawn(context, instance, image_meta, [ 728.505309] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 728.505309] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] self._vmops.spawn(context, instance, image_meta, injected_files, [ 728.505309] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 728.505309] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] vm_ref = self.build_virtual_machine(instance, [ 728.506394] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 728.506394] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] vif_infos = vmwarevif.get_vif_info(self._session, [ 728.506394] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 728.506394] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] for vif in network_info: [ 728.506394] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 728.506394] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] return self._sync_wrapper(fn, *args, **kwargs) [ 728.506394] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 728.506394] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] self.wait() [ 728.506394] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 728.506394] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] self[:] = self._gt.wait() [ 728.506394] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 728.506394] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] return self._exit_event.wait() [ 728.506394] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 728.506394] env[59659]: ERROR nova.compute.manager 
[instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] result = hub.switch() [ 728.506861] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 728.506861] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] return self.greenlet.switch() [ 728.506861] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 728.506861] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] result = function(*args, **kwargs) [ 728.506861] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 728.506861] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] return func(*args, **kwargs) [ 728.506861] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 728.506861] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] raise e [ 728.506861] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 728.506861] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] nwinfo = self.network_api.allocate_for_instance( [ 728.506861] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 728.506861] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] created_port_ids = self._update_ports_for_instance( [ 728.506861] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 728.507347] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] with excutils.save_and_reraise_exception(): [ 728.507347] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 728.507347] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] self.force_reraise() [ 728.507347] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 728.507347] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] raise self.value [ 728.507347] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 728.507347] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] updated_port = self._update_port( [ 728.507347] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 728.507347] env[59659]: ERROR nova.compute.manager [instance: 
7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] _ensure_no_port_binding_failure(port) [ 728.507347] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 728.507347] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] raise exception.PortBindingFailed(port_id=port['id']) [ 728.507347] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] nova.exception.PortBindingFailed: Binding failed for port dbbd873b-748b-4380-b79b-0889fea0b6d1, please check neutron logs for more information. [ 728.507347] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] [ 728.507826] env[59659]: INFO nova.compute.manager [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] Terminating instance [ 728.511469] env[59659]: DEBUG oslo_concurrency.lockutils [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Acquiring lock "refresh_cache-7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 728.511469] env[59659]: DEBUG oslo_concurrency.lockutils [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Acquired lock "refresh_cache-7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 728.511469] env[59659]: DEBUG nova.network.neutron [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 728.547591] env[59659]: DEBUG nova.network.neutron [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] Instance cache missing network info. 
{{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 728.551434] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cb84da48-41ea-4578-8a40-e088c4be2b4a {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 728.563765] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c1f313c5-c5a9-4d8d-a27c-129011e5a2ce {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 728.604909] env[59659]: INFO nova.scheduler.client.report [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] Deleted allocations for instance 1f7f6276-cfe5-4427-90b6-893e7ad6cffe [ 728.611229] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ace8941-a9a6-4611-a2b1-6e047c908e48 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 728.621246] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a0a2988a-7089-4eef-ace7-ab714462904a {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 728.626206] env[59659]: DEBUG oslo_concurrency.lockutils [None req-0098a2d1-2506-4fba-9164-111089f7368c tempest-ServersTestBootFromVolume-1432150623 tempest-ServersTestBootFromVolume-1432150623-project-member] Lock "1f7f6276-cfe5-4427-90b6-893e7ad6cffe" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 8.701s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 728.637664] env[59659]: DEBUG nova.compute.provider_tree [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 728.653319] env[59659]: DEBUG nova.scheduler.client.report [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 728.670097] env[59659]: DEBUG oslo_concurrency.lockutils [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.308s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 728.670702] env[59659]: ERROR 
nova.compute.manager [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] [instance: ce3bd633-4538-428d-9258-9222c3c72edd] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 808c0d62-376a-4bfe-a50c-92a1fa8874f0, please check neutron logs for more information. [ 728.670702] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] Traceback (most recent call last): [ 728.670702] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 728.670702] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] self.driver.spawn(context, instance, image_meta, [ 728.670702] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 728.670702] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] self._vmops.spawn(context, instance, image_meta, injected_files, [ 728.670702] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 728.670702] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] vm_ref = self.build_virtual_machine(instance, [ 728.670702] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 728.670702] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] vif_infos = vmwarevif.get_vif_info(self._session, [ 728.670702] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 728.671011] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] for vif in network_info: [ 728.671011] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 728.671011] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] return self._sync_wrapper(fn, *args, **kwargs) [ 728.671011] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 728.671011] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] self.wait() [ 728.671011] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 728.671011] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] self[:] = self._gt.wait() [ 728.671011] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 728.671011] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] return self._exit_event.wait() [ 728.671011] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] File 
"/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 728.671011] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] result = hub.switch() [ 728.671011] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 728.671011] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] return self.greenlet.switch() [ 728.671011] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 728.671354] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] result = function(*args, **kwargs) [ 728.671354] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 728.671354] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] return func(*args, **kwargs) [ 728.671354] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 728.671354] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] raise e [ 728.671354] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 728.671354] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] nwinfo = self.network_api.allocate_for_instance( [ 728.671354] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 728.671354] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] created_port_ids = self._update_ports_for_instance( [ 728.671354] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 728.671354] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] with excutils.save_and_reraise_exception(): [ 728.671354] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 728.671354] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] self.force_reraise() [ 728.671667] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 728.671667] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] raise self.value [ 728.671667] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 728.671667] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] updated_port = self._update_port( [ 728.671667] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] File 
"/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 728.671667] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] _ensure_no_port_binding_failure(port) [ 728.671667] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 728.671667] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] raise exception.PortBindingFailed(port_id=port['id']) [ 728.671667] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] nova.exception.PortBindingFailed: Binding failed for port 808c0d62-376a-4bfe-a50c-92a1fa8874f0, please check neutron logs for more information. [ 728.671667] env[59659]: ERROR nova.compute.manager [instance: ce3bd633-4538-428d-9258-9222c3c72edd] [ 728.671920] env[59659]: DEBUG nova.compute.utils [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] [instance: ce3bd633-4538-428d-9258-9222c3c72edd] Binding failed for port 808c0d62-376a-4bfe-a50c-92a1fa8874f0, please check neutron logs for more information. {{(pid=59659) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 728.673497] env[59659]: DEBUG nova.compute.manager [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] [instance: ce3bd633-4538-428d-9258-9222c3c72edd] Build of instance ce3bd633-4538-428d-9258-9222c3c72edd was re-scheduled: Binding failed for port 808c0d62-376a-4bfe-a50c-92a1fa8874f0, please check neutron logs for more information. 
{{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 728.673916] env[59659]: DEBUG nova.compute.manager [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] [instance: ce3bd633-4538-428d-9258-9222c3c72edd] Unplugging VIFs for instance {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 728.674165] env[59659]: DEBUG oslo_concurrency.lockutils [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] Acquiring lock "refresh_cache-ce3bd633-4538-428d-9258-9222c3c72edd" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 728.674304] env[59659]: DEBUG oslo_concurrency.lockutils [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] Acquired lock "refresh_cache-ce3bd633-4538-428d-9258-9222c3c72edd" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 728.674454] env[59659]: DEBUG nova.network.neutron [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] [instance: ce3bd633-4538-428d-9258-9222c3c72edd] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 728.718783] env[59659]: DEBUG nova.network.neutron [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] [instance: ce3bd633-4538-428d-9258-9222c3c72edd] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 728.737836] env[59659]: DEBUG oslo_concurrency.lockutils [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] Acquiring lock "1670e7a3-656a-444d-85ed-292956498612" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 728.738234] env[59659]: DEBUG oslo_concurrency.lockutils [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] Lock "1670e7a3-656a-444d-85ed-292956498612" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 728.747160] env[59659]: DEBUG nova.compute.manager [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] [instance: 1670e7a3-656a-444d-85ed-292956498612] Starting instance... 
{{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 728.804482] env[59659]: DEBUG oslo_concurrency.lockutils [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 728.804715] env[59659]: DEBUG oslo_concurrency.lockutils [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 728.806923] env[59659]: INFO nova.compute.claims [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] [instance: 1670e7a3-656a-444d-85ed-292956498612] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 728.834280] env[59659]: DEBUG nova.network.neutron [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 728.843289] env[59659]: DEBUG oslo_concurrency.lockutils [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Releasing lock "refresh_cache-7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 728.843674] env[59659]: DEBUG nova.compute.manager [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] Start destroying the instance on the hypervisor. 
{{(pid=59659) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 728.843855] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] Destroying instance {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 728.844391] env[59659]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-b16654f1-80c8-4a5b-b757-a8e48dd4d24f {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 728.854017] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e7dccc8-1991-4461-98a8-10c41936eb35 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 728.884549] env[59659]: WARNING nova.virt.vmwareapi.vmops [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998 could not be found. [ 728.884778] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] Instance destroyed {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 728.884957] env[59659]: INFO nova.compute.manager [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] Took 0.04 seconds to destroy the instance on the hypervisor. [ 728.885380] env[59659]: DEBUG oslo.service.loopingcall [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59659) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 728.885440] env[59659]: DEBUG nova.compute.manager [-] [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 728.885574] env[59659]: DEBUG nova.network.neutron [-] [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 728.916834] env[59659]: DEBUG nova.network.neutron [-] [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 728.927148] env[59659]: DEBUG nova.network.neutron [-] [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 728.948302] env[59659]: INFO nova.compute.manager [-] [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] Took 0.06 seconds to deallocate network for instance. 
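The destroy path above tolerates a VM that is already gone from the backend: the InstanceNotFound is logged as a warning ("Instance does not exist on backend") and the instance is treated as destroyed so network deallocation and the claim abort can still run. A minimal sketch of that pattern, assuming a hypothetical vm_backend object (find_by_uuid/destroy are illustrative names, not Nova's actual vmops API):

    import logging

    LOG = logging.getLogger(__name__)

    class InstanceNotFound(Exception):
        """Stand-in for nova.exception.InstanceNotFound."""

    def destroy_instance(vm_backend, instance_uuid):
        """Destroy a VM, treating a missing backend VM as already destroyed."""
        try:
            vm_ref = vm_backend.find_by_uuid(instance_uuid)   # hypothetical backend call
            vm_backend.destroy(vm_ref)                        # hypothetical backend call
        except InstanceNotFound:
            # Mirrors the WARNING in the log: the instance is simply considered
            # gone, and the caller proceeds to deallocate networks and abort
            # the resource claim.
            LOG.warning("Instance does not exist on backend: %s", instance_uuid)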
[ 728.950641] env[59659]: DEBUG nova.compute.claims [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] Aborting claim: {{(pid=59659) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 728.950641] env[59659]: DEBUG oslo_concurrency.lockutils [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 729.041237] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb52013a-ec85-4695-afe7-651b10dc7c5d {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.049505] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd3fa345-ac05-406c-8bd7-56bd5842d344 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.085027] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b1bcca9-f347-48fc-9c08-d5498cd09874 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.093087] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3671e60a-c06b-4ea6-8973-6322af095480 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.107191] env[59659]: DEBUG nova.compute.provider_tree [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 729.116615] env[59659]: DEBUG nova.scheduler.client.report [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 729.132547] env[59659]: DEBUG oslo_concurrency.lockutils [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.328s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 729.133073] env[59659]: DEBUG nova.compute.manager [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 
tempest-ServerRescueTestJSONUnderV235-580767535-project-member] [instance: 1670e7a3-656a-444d-85ed-292956498612] Start building networks asynchronously for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 729.135701] env[59659]: DEBUG oslo_concurrency.lockutils [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.185s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 729.167553] env[59659]: DEBUG nova.compute.utils [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] Using /dev/sd instead of None {{(pid=59659) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 729.169717] env[59659]: DEBUG nova.compute.manager [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] [instance: 1670e7a3-656a-444d-85ed-292956498612] Allocating IP information in the background. {{(pid=59659) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 729.169933] env[59659]: DEBUG nova.network.neutron [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] [instance: 1670e7a3-656a-444d-85ed-292956498612] allocate_for_instance() {{(pid=59659) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 729.175464] env[59659]: DEBUG nova.network.neutron [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] [instance: ce3bd633-4538-428d-9258-9222c3c72edd] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 729.178560] env[59659]: DEBUG nova.compute.manager [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] [instance: 1670e7a3-656a-444d-85ed-292956498612] Start building block device mappings for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 729.198913] env[59659]: DEBUG oslo_concurrency.lockutils [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] Releasing lock "refresh_cache-ce3bd633-4538-428d-9258-9222c3c72edd" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 729.201035] env[59659]: DEBUG nova.compute.manager [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 729.201035] env[59659]: DEBUG nova.compute.manager [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] [instance: ce3bd633-4538-428d-9258-9222c3c72edd] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 729.201035] env[59659]: DEBUG nova.network.neutron [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] [instance: ce3bd633-4538-428d-9258-9222c3c72edd] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 729.235313] env[59659]: DEBUG nova.network.neutron [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] [instance: ce3bd633-4538-428d-9258-9222c3c72edd] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 729.245264] env[59659]: DEBUG nova.network.neutron [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] [instance: ce3bd633-4538-428d-9258-9222c3c72edd] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 729.257258] env[59659]: INFO nova.compute.manager [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] [instance: ce3bd633-4538-428d-9258-9222c3c72edd] Took 0.06 seconds to deallocate network for instance. [ 729.268054] env[59659]: DEBUG nova.compute.manager [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] [instance: 1670e7a3-656a-444d-85ed-292956498612] Start spawning the instance on the hypervisor. 
{{(pid=59659) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 729.305375] env[59659]: DEBUG nova.virt.hardware [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-30T19:41:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-30T19:41:37Z,direct_url=,disk_format='vmdk',id=0fa786c9-f55e-46dc-b725-aa456ca9ff53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='3f4ccdf020b5413c9a4233eccb6b55a8',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-30T19:41:37Z,virtual_size=,visibility=), allow threads: False {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 729.305678] env[59659]: DEBUG nova.virt.hardware [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] Flavor limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 729.305885] env[59659]: DEBUG nova.virt.hardware [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] Image limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 729.306335] env[59659]: DEBUG nova.virt.hardware [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] Flavor pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 729.306523] env[59659]: DEBUG nova.virt.hardware [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] Image pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 729.306714] env[59659]: DEBUG nova.virt.hardware [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 729.306996] env[59659]: DEBUG nova.virt.hardware [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 729.309839] env[59659]: DEBUG nova.virt.hardware [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59659) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 729.309839] env[59659]: DEBUG nova.virt.hardware [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] Got 1 possible topologies {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 729.309839] env[59659]: DEBUG nova.virt.hardware [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 729.309839] env[59659]: DEBUG nova.virt.hardware [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 729.309839] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-75207ca4-697e-4758-b6a4-fe2b71565097 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.320060] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20a651d9-9252-494b-a546-a2a593bc0d23 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.327319] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e48d2cac-e10e-431c-b777-7d012d7bafcf {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.343726] env[59659]: DEBUG nova.policy [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5efb0e88d47b4a648b96635f0901c069', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'aa0980bc89de477bbd5ad4ad8f6ce2b4', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59659) authorize /opt/stack/nova/nova/policy.py:203}} [ 729.348421] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc0961d3-a9d7-4166-b84d-0661373379db {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.385466] env[59659]: INFO nova.scheduler.client.report [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] Deleted allocations for instance ce3bd633-4538-428d-9258-9222c3c72edd [ 729.391545] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db9cebff-df30-4b44-8bcc-e697a048e341 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.400876] env[59659]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf38f393-5d5a-4111-9292-95899a71b149 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.415265] env[59659]: DEBUG nova.compute.provider_tree [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 729.416379] env[59659]: DEBUG oslo_concurrency.lockutils [None req-a23dfe6a-4389-4ef3-ba9e-9ebf54fba058 tempest-InstanceActionsNegativeTestJSON-307740395 tempest-InstanceActionsNegativeTestJSON-307740395-project-member] Lock "ce3bd633-4538-428d-9258-9222c3c72edd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 14.512s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 729.429059] env[59659]: DEBUG nova.scheduler.client.report [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 729.443511] env[59659]: DEBUG oslo_concurrency.lockutils [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.307s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 729.443511] env[59659]: ERROR nova.compute.manager [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port dbbd873b-748b-4380-b79b-0889fea0b6d1, please check neutron logs for more information. 
[ 729.443511] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] Traceback (most recent call last): [ 729.443511] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 729.443511] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] self.driver.spawn(context, instance, image_meta, [ 729.443511] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 729.443511] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] self._vmops.spawn(context, instance, image_meta, injected_files, [ 729.443511] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 729.443511] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] vm_ref = self.build_virtual_machine(instance, [ 729.443830] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 729.443830] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] vif_infos = vmwarevif.get_vif_info(self._session, [ 729.443830] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 729.443830] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] for vif in network_info: [ 729.443830] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 729.443830] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] return self._sync_wrapper(fn, *args, **kwargs) [ 729.443830] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 729.443830] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] self.wait() [ 729.443830] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 729.443830] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] self[:] = self._gt.wait() [ 729.443830] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 729.443830] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] return self._exit_event.wait() [ 729.443830] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 729.443830] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] result = hub.switch() [ 729.444722] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 729.444722] env[59659]: ERROR 
nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] return self.greenlet.switch() [ 729.444722] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 729.444722] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] result = function(*args, **kwargs) [ 729.444722] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 729.444722] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] return func(*args, **kwargs) [ 729.444722] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 729.444722] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] raise e [ 729.444722] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 729.444722] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] nwinfo = self.network_api.allocate_for_instance( [ 729.444722] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 729.444722] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] created_port_ids = self._update_ports_for_instance( [ 729.444722] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 729.445267] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] with excutils.save_and_reraise_exception(): [ 729.445267] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 729.445267] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] self.force_reraise() [ 729.445267] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 729.445267] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] raise self.value [ 729.445267] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 729.445267] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] updated_port = self._update_port( [ 729.445267] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 729.445267] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] _ensure_no_port_binding_failure(port) [ 729.445267] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 729.445267] env[59659]: ERROR 
nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] raise exception.PortBindingFailed(port_id=port['id']) [ 729.445267] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] nova.exception.PortBindingFailed: Binding failed for port dbbd873b-748b-4380-b79b-0889fea0b6d1, please check neutron logs for more information. [ 729.445267] env[59659]: ERROR nova.compute.manager [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] [ 729.446132] env[59659]: DEBUG nova.compute.utils [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] Binding failed for port dbbd873b-748b-4380-b79b-0889fea0b6d1, please check neutron logs for more information. {{(pid=59659) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 729.446132] env[59659]: DEBUG nova.compute.manager [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] Build of instance 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998 was re-scheduled: Binding failed for port dbbd873b-748b-4380-b79b-0889fea0b6d1, please check neutron logs for more information. {{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 729.446132] env[59659]: DEBUG nova.compute.manager [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] Unplugging VIFs for instance {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 729.446306] env[59659]: DEBUG oslo_concurrency.lockutils [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Acquiring lock "refresh_cache-7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 729.446441] env[59659]: DEBUG oslo_concurrency.lockutils [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Acquired lock "refresh_cache-7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 729.446596] env[59659]: DEBUG nova.network.neutron [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 729.533352] env[59659]: DEBUG nova.network.neutron [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] Instance cache missing network info. 
{{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 729.624326] env[59659]: ERROR nova.compute.manager [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 8eed8767-c291-4cbf-8803-07bd0caa822b, please check neutron logs for more information. [ 729.624326] env[59659]: ERROR nova.compute.manager Traceback (most recent call last): [ 729.624326] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 729.624326] env[59659]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 729.624326] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 729.624326] env[59659]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 729.624326] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 729.624326] env[59659]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 729.624326] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 729.624326] env[59659]: ERROR nova.compute.manager self.force_reraise() [ 729.624326] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 729.624326] env[59659]: ERROR nova.compute.manager raise self.value [ 729.624326] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 729.624326] env[59659]: ERROR nova.compute.manager updated_port = self._update_port( [ 729.624326] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 729.624326] env[59659]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 729.624804] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 729.624804] env[59659]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 729.624804] env[59659]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 8eed8767-c291-4cbf-8803-07bd0caa822b, please check neutron logs for more information. 
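Each PortBindingFailed traceback above passes through oslo.utils' save_and_reraise_exception() context manager inside _update_ports_for_instance, which lets cleanup run and then re-raises the original error (the force_reraise() frames visible in the traceback). A minimal sketch of that pattern using the real excutils API; update_port and cleanup_created_ports are hypothetical stand-ins for the surrounding Nova logic:

    from oslo_utils import excutils

    def update_ports(ports, update_port, cleanup_created_ports):
        """Update ports one by one; on failure, clean up then re-raise the original error."""
        created = []
        for port in ports:
            try:
                created.append(update_port(port))
            except Exception:
                with excutils.save_and_reraise_exception():
                    # Runs while the original exception is preserved; when the
                    # with-block exits, force_reraise() raises it again, which
                    # is exactly the frame pair seen in the tracebacks.
                    cleanup_created_ports(created)
        return created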
[ 729.624804] env[59659]: ERROR nova.compute.manager [ 729.624804] env[59659]: Traceback (most recent call last): [ 729.624804] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 729.624804] env[59659]: listener.cb(fileno) [ 729.624804] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 729.624804] env[59659]: result = function(*args, **kwargs) [ 729.624804] env[59659]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 729.624804] env[59659]: return func(*args, **kwargs) [ 729.624804] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 729.624804] env[59659]: raise e [ 729.624804] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 729.624804] env[59659]: nwinfo = self.network_api.allocate_for_instance( [ 729.624804] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 729.624804] env[59659]: created_port_ids = self._update_ports_for_instance( [ 729.624804] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 729.624804] env[59659]: with excutils.save_and_reraise_exception(): [ 729.624804] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 729.624804] env[59659]: self.force_reraise() [ 729.624804] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 729.624804] env[59659]: raise self.value [ 729.624804] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 729.624804] env[59659]: updated_port = self._update_port( [ 729.624804] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 729.624804] env[59659]: _ensure_no_port_binding_failure(port) [ 729.624804] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 729.624804] env[59659]: raise exception.PortBindingFailed(port_id=port['id']) [ 729.625808] env[59659]: nova.exception.PortBindingFailed: Binding failed for port 8eed8767-c291-4cbf-8803-07bd0caa822b, please check neutron logs for more information. [ 729.625808] env[59659]: Removing descriptor: 17 [ 729.625808] env[59659]: ERROR nova.compute.manager [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: f7fc4465-02a5-4715-b50b-04172f097350] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 8eed8767-c291-4cbf-8803-07bd0caa822b, please check neutron logs for more information. 
[ 729.625808] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] Traceback (most recent call last): [ 729.625808] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 729.625808] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] yield resources [ 729.625808] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 729.625808] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] self.driver.spawn(context, instance, image_meta, [ 729.625808] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 729.625808] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] self._vmops.spawn(context, instance, image_meta, injected_files, [ 729.625808] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 729.625808] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] vm_ref = self.build_virtual_machine(instance, [ 729.626143] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 729.626143] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] vif_infos = vmwarevif.get_vif_info(self._session, [ 729.626143] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 729.626143] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] for vif in network_info: [ 729.626143] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 729.626143] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] return self._sync_wrapper(fn, *args, **kwargs) [ 729.626143] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 729.626143] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] self.wait() [ 729.626143] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 729.626143] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] self[:] = self._gt.wait() [ 729.626143] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 729.626143] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] return self._exit_event.wait() [ 729.626143] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 729.626143] env[59659]: ERROR nova.compute.manager 
[instance: f7fc4465-02a5-4715-b50b-04172f097350] result = hub.switch() [ 729.626524] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 729.626524] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] return self.greenlet.switch() [ 729.626524] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 729.626524] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] result = function(*args, **kwargs) [ 729.626524] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 729.626524] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] return func(*args, **kwargs) [ 729.626524] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 729.626524] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] raise e [ 729.626524] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 729.626524] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] nwinfo = self.network_api.allocate_for_instance( [ 729.626524] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 729.626524] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] created_port_ids = self._update_ports_for_instance( [ 729.626524] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 729.626854] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] with excutils.save_and_reraise_exception(): [ 729.626854] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 729.626854] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] self.force_reraise() [ 729.626854] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 729.626854] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] raise self.value [ 729.626854] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 729.626854] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] updated_port = self._update_port( [ 729.626854] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 729.626854] env[59659]: ERROR nova.compute.manager [instance: 
f7fc4465-02a5-4715-b50b-04172f097350] _ensure_no_port_binding_failure(port) [ 729.626854] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 729.626854] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] raise exception.PortBindingFailed(port_id=port['id']) [ 729.626854] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] nova.exception.PortBindingFailed: Binding failed for port 8eed8767-c291-4cbf-8803-07bd0caa822b, please check neutron logs for more information. [ 729.626854] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] [ 729.627410] env[59659]: INFO nova.compute.manager [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: f7fc4465-02a5-4715-b50b-04172f097350] Terminating instance [ 729.628355] env[59659]: DEBUG oslo_concurrency.lockutils [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Acquiring lock "refresh_cache-f7fc4465-02a5-4715-b50b-04172f097350" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 729.628514] env[59659]: DEBUG oslo_concurrency.lockutils [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Acquired lock "refresh_cache-f7fc4465-02a5-4715-b50b-04172f097350" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 729.628680] env[59659]: DEBUG nova.network.neutron [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: f7fc4465-02a5-4715-b50b-04172f097350] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 729.694422] env[59659]: ERROR nova.compute.manager [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 66b70acf-8cb8-4462-b90d-f49fb9026d15, please check neutron logs for more information. 
[ 729.694422] env[59659]: ERROR nova.compute.manager Traceback (most recent call last): [ 729.694422] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 729.694422] env[59659]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 729.694422] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 729.694422] env[59659]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 729.694422] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 729.694422] env[59659]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 729.694422] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 729.694422] env[59659]: ERROR nova.compute.manager self.force_reraise() [ 729.694422] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 729.694422] env[59659]: ERROR nova.compute.manager raise self.value [ 729.694422] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 729.694422] env[59659]: ERROR nova.compute.manager updated_port = self._update_port( [ 729.694422] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 729.694422] env[59659]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 729.694879] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 729.694879] env[59659]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 729.694879] env[59659]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 66b70acf-8cb8-4462-b90d-f49fb9026d15, please check neutron logs for more information. 
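The "Instance failed to spawn" errors only surface when the VMware driver iterates network_info in get_vif_info, because network allocation runs on an eventlet greenthread and the stored exception is re-raised by wait(). A minimal sketch of that deferred-failure behaviour using eventlet directly (allocate_network and the port id are illustrative, not Nova's code):

    import eventlet

    def allocate_network(port_id):
        # Stand-in for the async network allocation: an exception raised here is
        # stored on the greenthread and only re-raised when the result is waited on.
        raise RuntimeError("Binding failed for port %s" % port_id)

    gt = eventlet.spawn(allocate_network, "example-port-id")  # build work continues meanwhile

    try:
        network_info = gt.wait()   # first real use of the result; the failure surfaces here
    except RuntimeError as exc:
        print("network allocation failed late: %s" % exc)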
[ 729.694879] env[59659]: ERROR nova.compute.manager [ 729.694879] env[59659]: Traceback (most recent call last): [ 729.694879] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 729.694879] env[59659]: listener.cb(fileno) [ 729.694879] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 729.694879] env[59659]: result = function(*args, **kwargs) [ 729.694879] env[59659]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 729.694879] env[59659]: return func(*args, **kwargs) [ 729.694879] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 729.694879] env[59659]: raise e [ 729.694879] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 729.694879] env[59659]: nwinfo = self.network_api.allocate_for_instance( [ 729.694879] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 729.694879] env[59659]: created_port_ids = self._update_ports_for_instance( [ 729.694879] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 729.694879] env[59659]: with excutils.save_and_reraise_exception(): [ 729.694879] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 729.694879] env[59659]: self.force_reraise() [ 729.694879] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 729.694879] env[59659]: raise self.value [ 729.694879] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 729.694879] env[59659]: updated_port = self._update_port( [ 729.694879] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 729.694879] env[59659]: _ensure_no_port_binding_failure(port) [ 729.694879] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 729.694879] env[59659]: raise exception.PortBindingFailed(port_id=port['id']) [ 729.695623] env[59659]: nova.exception.PortBindingFailed: Binding failed for port 66b70acf-8cb8-4462-b90d-f49fb9026d15, please check neutron logs for more information. [ 729.695623] env[59659]: Removing descriptor: 21 [ 729.695623] env[59659]: ERROR nova.compute.manager [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 66b70acf-8cb8-4462-b90d-f49fb9026d15, please check neutron logs for more information. 
[ 729.695623] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] Traceback (most recent call last): [ 729.695623] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 729.695623] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] yield resources [ 729.695623] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 729.695623] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] self.driver.spawn(context, instance, image_meta, [ 729.695623] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 729.695623] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] self._vmops.spawn(context, instance, image_meta, injected_files, [ 729.695623] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 729.695623] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] vm_ref = self.build_virtual_machine(instance, [ 729.695936] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 729.695936] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] vif_infos = vmwarevif.get_vif_info(self._session, [ 729.695936] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 729.695936] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] for vif in network_info: [ 729.695936] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 729.695936] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] return self._sync_wrapper(fn, *args, **kwargs) [ 729.695936] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 729.695936] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] self.wait() [ 729.695936] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 729.695936] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] self[:] = self._gt.wait() [ 729.695936] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 729.695936] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] return self._exit_event.wait() [ 729.695936] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 729.695936] env[59659]: ERROR nova.compute.manager 
[instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] result = hub.switch() [ 729.696427] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 729.696427] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] return self.greenlet.switch() [ 729.696427] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 729.696427] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] result = function(*args, **kwargs) [ 729.696427] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 729.696427] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] return func(*args, **kwargs) [ 729.696427] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 729.696427] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] raise e [ 729.696427] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 729.696427] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] nwinfo = self.network_api.allocate_for_instance( [ 729.696427] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 729.696427] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] created_port_ids = self._update_ports_for_instance( [ 729.696427] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 729.696794] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] with excutils.save_and_reraise_exception(): [ 729.696794] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 729.696794] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] self.force_reraise() [ 729.696794] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 729.696794] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] raise self.value [ 729.696794] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 729.696794] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] updated_port = self._update_port( [ 729.696794] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 729.696794] env[59659]: ERROR nova.compute.manager [instance: 
588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] _ensure_no_port_binding_failure(port) [ 729.696794] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 729.696794] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] raise exception.PortBindingFailed(port_id=port['id']) [ 729.696794] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] nova.exception.PortBindingFailed: Binding failed for port 66b70acf-8cb8-4462-b90d-f49fb9026d15, please check neutron logs for more information. [ 729.696794] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] [ 729.697253] env[59659]: INFO nova.compute.manager [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] Terminating instance [ 729.701486] env[59659]: DEBUG oslo_concurrency.lockutils [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Acquiring lock "refresh_cache-588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 729.701486] env[59659]: DEBUG oslo_concurrency.lockutils [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Acquired lock "refresh_cache-588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 729.701486] env[59659]: DEBUG nova.network.neutron [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 729.722051] env[59659]: DEBUG nova.network.neutron [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: f7fc4465-02a5-4715-b50b-04172f097350] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 729.772331] env[59659]: DEBUG nova.network.neutron [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] Instance cache missing network info. 
{{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 729.893535] env[59659]: DEBUG nova.network.neutron [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 729.903119] env[59659]: DEBUG oslo_concurrency.lockutils [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Releasing lock "refresh_cache-7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 729.903342] env[59659]: DEBUG nova.compute.manager [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 729.903522] env[59659]: DEBUG nova.compute.manager [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 729.903679] env[59659]: DEBUG nova.network.neutron [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 729.946489] env[59659]: DEBUG nova.network.neutron [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 729.954478] env[59659]: DEBUG nova.network.neutron [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 729.964543] env[59659]: INFO nova.compute.manager [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998] Took 0.06 seconds to deallocate network for instance. 
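The "Acquiring/Acquired/Releasing lock 'refresh_cache-<uuid>'" lines above come from oslo_concurrency.lockutils, which the compute manager uses to serialize network-info cache refreshes per instance. A hedged sketch of that locking pattern; the function and its arguments are illustrative, only the lock-name convention is taken from the log:

    from oslo_concurrency import lockutils

    def refresh_instance_cache(instance_uuid, fetch_nw_info):
        # one refresh at a time per instance, mirroring the
        # 'refresh_cache-<uuid>' lock names in the log
        with lockutils.lock('refresh_cache-%s' % instance_uuid):
            return fetch_nw_info(instance_uuid)

    # usage sketch:
    # refresh_instance_cache('7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998', lambda u: [])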
[ 730.060030] env[59659]: INFO nova.scheduler.client.report [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Deleted allocations for instance 7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998 [ 730.077583] env[59659]: DEBUG oslo_concurrency.lockutils [None req-333acd96-25f6-4d5e-b8a6-27bcdb473198 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Lock "7b7c9dcb-3ddd-419e-a6fa-6b37b01cc998" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 7.645s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 730.417315] env[59659]: DEBUG nova.network.neutron [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: f7fc4465-02a5-4715-b50b-04172f097350] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 730.431019] env[59659]: DEBUG oslo_concurrency.lockutils [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Releasing lock "refresh_cache-f7fc4465-02a5-4715-b50b-04172f097350" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 730.431019] env[59659]: DEBUG nova.compute.manager [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: f7fc4465-02a5-4715-b50b-04172f097350] Start destroying the instance on the hypervisor. {{(pid=59659) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 730.431019] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: f7fc4465-02a5-4715-b50b-04172f097350] Destroying instance {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 730.431019] env[59659]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-b4f4be18-67a8-43d2-a483-115764d57eb6 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 730.439892] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a84e5104-2a2a-460a-8abe-1807ab0e8186 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 730.465612] env[59659]: WARNING nova.virt.vmwareapi.vmops [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: f7fc4465-02a5-4715-b50b-04172f097350] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance f7fc4465-02a5-4715-b50b-04172f097350 could not be found. 
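The "Instance does not exist on backend" warning above, followed immediately by "Instance destroyed", shows destroy treating a missing backend VM as already gone rather than as a hard failure. A minimal sketch of that tolerance with stand-in names; InstanceNotFound here is a local class, not nova.exception:

    class InstanceNotFound(Exception):
        pass

    def destroy_instance(lookup_vm, delete_vm, instance_uuid):
        try:
            vm_ref = lookup_vm(instance_uuid)
        except InstanceNotFound:
            # nothing on the hypervisor to tear down; continue with
            # network deallocation and claim cleanup, as the log does
            return
        delete_vm(vm_ref)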
[ 730.465928] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: f7fc4465-02a5-4715-b50b-04172f097350] Instance destroyed {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 730.466183] env[59659]: INFO nova.compute.manager [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: f7fc4465-02a5-4715-b50b-04172f097350] Took 0.04 seconds to destroy the instance on the hypervisor. [ 730.466860] env[59659]: DEBUG oslo.service.loopingcall [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59659) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 730.468906] env[59659]: DEBUG nova.compute.manager [-] [instance: f7fc4465-02a5-4715-b50b-04172f097350] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 730.468906] env[59659]: DEBUG nova.network.neutron [-] [instance: f7fc4465-02a5-4715-b50b-04172f097350] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 730.472996] env[59659]: DEBUG nova.network.neutron [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 730.488451] env[59659]: DEBUG oslo_concurrency.lockutils [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Releasing lock "refresh_cache-588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 730.488850] env[59659]: DEBUG nova.compute.manager [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] Start destroying the instance on the hypervisor. {{(pid=59659) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 730.489041] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] Destroying instance {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 730.489545] env[59659]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-d5ea4ff0-b62e-4609-8559-b773b385d770 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 730.498627] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4ce7e20-2d63-4a7c-994d-dc678256ce24 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 730.509494] env[59659]: DEBUG nova.network.neutron [-] [instance: f7fc4465-02a5-4715-b50b-04172f097350] Instance cache missing network info. 
{{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 730.525416] env[59659]: WARNING nova.virt.vmwareapi.vmops [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06 could not be found. [ 730.525761] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] Instance destroyed {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 730.525977] env[59659]: INFO nova.compute.manager [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] Took 0.04 seconds to destroy the instance on the hypervisor. [ 730.526242] env[59659]: DEBUG oslo.service.loopingcall [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59659) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 730.526479] env[59659]: DEBUG nova.network.neutron [-] [instance: f7fc4465-02a5-4715-b50b-04172f097350] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 730.527614] env[59659]: DEBUG nova.compute.manager [-] [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 730.527720] env[59659]: DEBUG nova.network.neutron [-] [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 730.540026] env[59659]: INFO nova.compute.manager [-] [instance: f7fc4465-02a5-4715-b50b-04172f097350] Took 0.07 seconds to deallocate network for instance. 
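The "Waiting for function ... _deallocate_network_with_retries to return" lines above come from oslo.service's looping-call machinery (oslo_service/loopingcall.py). A self-contained sketch of that retry style, assuming oslo_service is importable; the teardown stub, retry count, and interval are illustrative, not nova's actual settings:

    from oslo_service import loopingcall

    attempts = {'n': 0}

    def _teardown_network():
        return None  # stand-in for the real network deallocation call

    def _deallocate_with_retries():
        attempts['n'] += 1
        try:
            _teardown_network()
        except Exception:
            if attempts['n'] < 3:
                return                       # retry on the next interval
        raise loopingcall.LoopingCallDone()  # stop looping (done or gave up)

    timer = loopingcall.FixedIntervalLoopingCall(_deallocate_with_retries)
    timer.start(interval=1).wait()  # block until LoopingCallDone, as in the log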
[ 730.541996] env[59659]: DEBUG nova.compute.claims [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: f7fc4465-02a5-4715-b50b-04172f097350] Aborting claim: {{(pid=59659) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 730.542174] env[59659]: DEBUG oslo_concurrency.lockutils [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 730.542378] env[59659]: DEBUG oslo_concurrency.lockutils [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 730.565633] env[59659]: DEBUG nova.network.neutron [-] [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 730.584168] env[59659]: DEBUG nova.network.neutron [-] [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 730.593642] env[59659]: INFO nova.compute.manager [-] [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] Took 0.06 seconds to deallocate network for instance. [ 730.594384] env[59659]: DEBUG nova.compute.claims [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] Aborting claim: {{(pid=59659) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 730.594955] env[59659]: DEBUG oslo_concurrency.lockutils [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 730.621414] env[59659]: DEBUG nova.network.neutron [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] [instance: 1670e7a3-656a-444d-85ed-292956498612] Successfully created port: ef1cd5c0-1945-454d-a3f4-40d46a56364f {{(pid=59659) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 730.690063] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed69301d-07d6-46bf-9f9a-4b38ee2de3d0 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 730.698170] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f9bdb07c-a841-4cf8-a007-b9c4e4be8412 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 730.731152] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-5cd1d8ee-b81a-4437-97e5-c5bb92d5dfb3 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 730.741124] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa2e1e39-6b7f-4ead-9514-3140f2c8236e {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 730.755714] env[59659]: DEBUG nova.compute.provider_tree [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 730.767073] env[59659]: DEBUG nova.scheduler.client.report [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 730.787800] env[59659]: DEBUG oslo_concurrency.lockutils [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.245s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 730.788466] env[59659]: ERROR nova.compute.manager [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: f7fc4465-02a5-4715-b50b-04172f097350] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 8eed8767-c291-4cbf-8803-07bd0caa822b, please check neutron logs for more information. 
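The inventory data reported above maps to usable capacity via the usual placement formula, capacity = (total - reserved) * allocation_ratio (max_unit additionally caps any single allocation). A quick check with the numbers from this log:

    # capacity implied by the provider inventory reported above
    inventory = {
        'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0},
    }
    for rc, inv in inventory.items():
        capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print(rc, capacity)  # VCPU 192.0, MEMORY_MB 196078.0, DISK_GB 400.0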
[ 730.788466] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] Traceback (most recent call last): [ 730.788466] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 730.788466] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] self.driver.spawn(context, instance, image_meta, [ 730.788466] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 730.788466] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] self._vmops.spawn(context, instance, image_meta, injected_files, [ 730.788466] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 730.788466] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] vm_ref = self.build_virtual_machine(instance, [ 730.788466] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 730.788466] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] vif_infos = vmwarevif.get_vif_info(self._session, [ 730.788466] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 730.788832] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] for vif in network_info: [ 730.788832] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 730.788832] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] return self._sync_wrapper(fn, *args, **kwargs) [ 730.788832] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 730.788832] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] self.wait() [ 730.788832] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 730.788832] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] self[:] = self._gt.wait() [ 730.788832] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 730.788832] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] return self._exit_event.wait() [ 730.788832] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 730.788832] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] result = hub.switch() [ 730.788832] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 730.788832] env[59659]: ERROR 
nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] return self.greenlet.switch() [ 730.788832] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 730.789169] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] result = function(*args, **kwargs) [ 730.789169] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 730.789169] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] return func(*args, **kwargs) [ 730.789169] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 730.789169] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] raise e [ 730.789169] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 730.789169] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] nwinfo = self.network_api.allocate_for_instance( [ 730.789169] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 730.789169] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] created_port_ids = self._update_ports_for_instance( [ 730.789169] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 730.789169] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] with excutils.save_and_reraise_exception(): [ 730.789169] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 730.789169] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] self.force_reraise() [ 730.789530] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 730.789530] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] raise self.value [ 730.789530] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 730.789530] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] updated_port = self._update_port( [ 730.789530] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 730.789530] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] _ensure_no_port_binding_failure(port) [ 730.789530] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 730.789530] env[59659]: ERROR 
nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] raise exception.PortBindingFailed(port_id=port['id']) [ 730.789530] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] nova.exception.PortBindingFailed: Binding failed for port 8eed8767-c291-4cbf-8803-07bd0caa822b, please check neutron logs for more information. [ 730.789530] env[59659]: ERROR nova.compute.manager [instance: f7fc4465-02a5-4715-b50b-04172f097350] [ 730.789530] env[59659]: DEBUG nova.compute.utils [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: f7fc4465-02a5-4715-b50b-04172f097350] Binding failed for port 8eed8767-c291-4cbf-8803-07bd0caa822b, please check neutron logs for more information. {{(pid=59659) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 730.791197] env[59659]: DEBUG oslo_concurrency.lockutils [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.196s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 730.799022] env[59659]: DEBUG nova.compute.manager [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: f7fc4465-02a5-4715-b50b-04172f097350] Build of instance f7fc4465-02a5-4715-b50b-04172f097350 was re-scheduled: Binding failed for port 8eed8767-c291-4cbf-8803-07bd0caa822b, please check neutron logs for more information. {{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 730.799022] env[59659]: DEBUG nova.compute.manager [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: f7fc4465-02a5-4715-b50b-04172f097350] Unplugging VIFs for instance {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 730.799022] env[59659]: DEBUG oslo_concurrency.lockutils [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Acquiring lock "refresh_cache-f7fc4465-02a5-4715-b50b-04172f097350" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 730.799022] env[59659]: DEBUG oslo_concurrency.lockutils [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Acquired lock "refresh_cache-f7fc4465-02a5-4715-b50b-04172f097350" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 730.799230] env[59659]: DEBUG nova.network.neutron [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: f7fc4465-02a5-4715-b50b-04172f097350] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 730.832018] env[59659]: DEBUG oslo_concurrency.lockutils [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] Acquiring lock "ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 730.832018] env[59659]: DEBUG oslo_concurrency.lockutils [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] Lock "ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 730.845031] env[59659]: DEBUG nova.compute.manager [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] Starting instance... {{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 730.850104] env[59659]: DEBUG nova.network.neutron [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: f7fc4465-02a5-4715-b50b-04172f097350] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 730.912467] env[59659]: DEBUG oslo_concurrency.lockutils [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 730.956253] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f553e439-a927-4128-aa0a-ae8f55c6b7fd {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 730.966697] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d48a1f7-22e0-4ac1-9b2f-f318218f3dcc {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 730.997166] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6dfb9441-bba0-4471-8a4b-10ca67f03bc7 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 731.004954] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-36c56ed9-651e-4857-b1c5-a9d950848023 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 731.018696] env[59659]: DEBUG nova.compute.provider_tree [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 731.027276] env[59659]: DEBUG nova.scheduler.client.report [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 
'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 731.042576] env[59659]: DEBUG oslo_concurrency.lockutils [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.252s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 731.043768] env[59659]: ERROR nova.compute.manager [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 66b70acf-8cb8-4462-b90d-f49fb9026d15, please check neutron logs for more information. [ 731.043768] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] Traceback (most recent call last): [ 731.043768] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 731.043768] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] self.driver.spawn(context, instance, image_meta, [ 731.043768] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 731.043768] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] self._vmops.spawn(context, instance, image_meta, injected_files, [ 731.043768] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 731.043768] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] vm_ref = self.build_virtual_machine(instance, [ 731.043768] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 731.043768] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] vif_infos = vmwarevif.get_vif_info(self._session, [ 731.043768] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 731.044202] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] for vif in network_info: [ 731.044202] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 731.044202] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] return self._sync_wrapper(fn, *args, **kwargs) [ 731.044202] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 731.044202] env[59659]: ERROR nova.compute.manager [instance: 
588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] self.wait() [ 731.044202] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 731.044202] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] self[:] = self._gt.wait() [ 731.044202] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 731.044202] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] return self._exit_event.wait() [ 731.044202] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 731.044202] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] result = hub.switch() [ 731.044202] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 731.044202] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] return self.greenlet.switch() [ 731.044202] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 731.044537] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] result = function(*args, **kwargs) [ 731.044537] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 731.044537] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] return func(*args, **kwargs) [ 731.044537] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 731.044537] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] raise e [ 731.044537] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 731.044537] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] nwinfo = self.network_api.allocate_for_instance( [ 731.044537] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 731.044537] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] created_port_ids = self._update_ports_for_instance( [ 731.044537] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 731.044537] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] with excutils.save_and_reraise_exception(): [ 731.044537] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 731.044537] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] 
self.force_reraise() [ 731.044883] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 731.044883] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] raise self.value [ 731.044883] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 731.044883] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] updated_port = self._update_port( [ 731.044883] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 731.044883] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] _ensure_no_port_binding_failure(port) [ 731.044883] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 731.044883] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] raise exception.PortBindingFailed(port_id=port['id']) [ 731.044883] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] nova.exception.PortBindingFailed: Binding failed for port 66b70acf-8cb8-4462-b90d-f49fb9026d15, please check neutron logs for more information. [ 731.044883] env[59659]: ERROR nova.compute.manager [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] [ 731.044883] env[59659]: DEBUG nova.compute.utils [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] Binding failed for port 66b70acf-8cb8-4462-b90d-f49fb9026d15, please check neutron logs for more information. {{(pid=59659) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 731.045180] env[59659]: DEBUG oslo_concurrency.lockutils [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.136s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 731.046432] env[59659]: INFO nova.compute.claims [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 731.049058] env[59659]: DEBUG nova.compute.manager [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] Build of instance 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06 was re-scheduled: Binding failed for port 66b70acf-8cb8-4462-b90d-f49fb9026d15, please check neutron logs for more information. 
{{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 731.049500] env[59659]: DEBUG nova.compute.manager [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] Unplugging VIFs for instance {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 731.049718] env[59659]: DEBUG oslo_concurrency.lockutils [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Acquiring lock "refresh_cache-588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 731.049859] env[59659]: DEBUG oslo_concurrency.lockutils [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Acquired lock "refresh_cache-588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 731.050102] env[59659]: DEBUG nova.network.neutron [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 731.179831] env[59659]: DEBUG nova.network.neutron [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 731.215379] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c473ebdd-e1c3-4776-86e3-160fe3a89bf1 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 731.224041] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d4cf0ee-47da-4f34-a215-2e041668ec22 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 731.260965] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e74dddc-0bc2-470d-9d9d-3284507f0610 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 731.268952] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ee7d803-6523-4aa5-a750-dc781c5991de {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 731.285918] env[59659]: DEBUG nova.compute.provider_tree [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 731.298693] env[59659]: DEBUG nova.scheduler.client.report [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] Inventory has not changed for provider 
69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 731.312556] env[59659]: DEBUG oslo_concurrency.lockutils [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.267s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 731.313034] env[59659]: DEBUG nova.compute.manager [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] Start building networks asynchronously for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 731.355406] env[59659]: DEBUG nova.compute.utils [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] Using /dev/sd instead of None {{(pid=59659) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 731.356895] env[59659]: DEBUG nova.compute.manager [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] Allocating IP information in the background. {{(pid=59659) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 731.356895] env[59659]: DEBUG nova.network.neutron [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] allocate_for_instance() {{(pid=59659) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 731.368988] env[59659]: DEBUG nova.compute.manager [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] Start building block device mappings for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 731.440192] env[59659]: DEBUG nova.compute.manager [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] Start spawning the instance on the hypervisor. 
{{(pid=59659) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 731.464236] env[59659]: DEBUG nova.virt.hardware [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-30T19:41:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-30T19:41:37Z,direct_url=,disk_format='vmdk',id=0fa786c9-f55e-46dc-b725-aa456ca9ff53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='3f4ccdf020b5413c9a4233eccb6b55a8',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-30T19:41:37Z,virtual_size=,visibility=), allow threads: False {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 731.464466] env[59659]: DEBUG nova.virt.hardware [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] Flavor limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 731.464650] env[59659]: DEBUG nova.virt.hardware [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] Image limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 731.464778] env[59659]: DEBUG nova.virt.hardware [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] Flavor pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 731.464909] env[59659]: DEBUG nova.virt.hardware [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] Image pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 731.465055] env[59659]: DEBUG nova.virt.hardware [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 731.465287] env[59659]: DEBUG nova.virt.hardware [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 731.465433] env[59659]: DEBUG nova.virt.hardware [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 731.465620] env[59659]: DEBUG nova.virt.hardware [None 
req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] Got 1 possible topologies {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 731.466491] env[59659]: DEBUG nova.virt.hardware [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 731.466491] env[59659]: DEBUG nova.virt.hardware [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 731.466754] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bdf0e698-a4ac-4452-8ad8-c644b511928d {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 731.474978] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b39949be-80c3-41d8-bda4-8b377340c91c {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 731.490671] env[59659]: DEBUG nova.network.neutron [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: f7fc4465-02a5-4715-b50b-04172f097350] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 731.503443] env[59659]: DEBUG oslo_concurrency.lockutils [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Releasing lock "refresh_cache-f7fc4465-02a5-4715-b50b-04172f097350" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 731.503652] env[59659]: DEBUG nova.compute.manager [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 731.503831] env[59659]: DEBUG nova.compute.manager [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: f7fc4465-02a5-4715-b50b-04172f097350] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 731.503992] env[59659]: DEBUG nova.network.neutron [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: f7fc4465-02a5-4715-b50b-04172f097350] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 731.512849] env[59659]: DEBUG nova.policy [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9c848bf8c29f415e96eee55939826dde', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '319a39cba8574bc7a3c92d4527f59cad', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59659) authorize /opt/stack/nova/nova/policy.py:203}} [ 731.545088] env[59659]: DEBUG nova.network.neutron [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: f7fc4465-02a5-4715-b50b-04172f097350] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 731.554211] env[59659]: DEBUG nova.network.neutron [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: f7fc4465-02a5-4715-b50b-04172f097350] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 731.564264] env[59659]: INFO nova.compute.manager [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: f7fc4465-02a5-4715-b50b-04172f097350] Took 0.06 seconds to deallocate network for instance. 
[ 731.570538] env[59659]: DEBUG nova.network.neutron [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 731.578842] env[59659]: DEBUG oslo_concurrency.lockutils [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Releasing lock "refresh_cache-588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 731.579040] env[59659]: DEBUG nova.compute.manager [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 731.579223] env[59659]: DEBUG nova.compute.manager [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 731.579488] env[59659]: DEBUG nova.network.neutron [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 731.638568] env[59659]: DEBUG nova.network.neutron [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 731.649893] env[59659]: DEBUG nova.network.neutron [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 731.664245] env[59659]: INFO nova.compute.manager [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06] Took 0.08 seconds to deallocate network for instance. 
[ 731.676348] env[59659]: INFO nova.scheduler.client.report [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Deleted allocations for instance f7fc4465-02a5-4715-b50b-04172f097350 [ 731.709617] env[59659]: DEBUG oslo_concurrency.lockutils [None req-bfee6349-6a74-41ed-8ad7-a6e50bc9516d tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Lock "f7fc4465-02a5-4715-b50b-04172f097350" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 15.466s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 731.774400] env[59659]: INFO nova.scheduler.client.report [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Deleted allocations for instance 588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06 [ 731.795059] env[59659]: DEBUG oslo_concurrency.lockutils [None req-2185c308-83d0-4687-8494-f4cfcd93bb63 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Lock "588ad4c7-67fd-4d4f-ae0f-bb1ded40ef06" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 15.525s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 732.104600] env[59659]: DEBUG oslo_concurrency.lockutils [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] Acquiring lock "e90ee443-efe0-4f3e-999b-b9376e41fcb5" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 732.104600] env[59659]: DEBUG oslo_concurrency.lockutils [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] Lock "e90ee443-efe0-4f3e-999b-b9376e41fcb5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 732.112189] env[59659]: ERROR nova.compute.manager [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 346f3b4a-8fe0-46e9-8eb3-97e76ce72c7f, please check neutron logs for more information. 
[ 732.112189] env[59659]: ERROR nova.compute.manager Traceback (most recent call last): [ 732.112189] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 732.112189] env[59659]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 732.112189] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 732.112189] env[59659]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 732.112189] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 732.112189] env[59659]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 732.112189] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 732.112189] env[59659]: ERROR nova.compute.manager self.force_reraise() [ 732.112189] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 732.112189] env[59659]: ERROR nova.compute.manager raise self.value [ 732.112189] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 732.112189] env[59659]: ERROR nova.compute.manager updated_port = self._update_port( [ 732.112189] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 732.112189] env[59659]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 732.112648] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 732.112648] env[59659]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 732.112648] env[59659]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 346f3b4a-8fe0-46e9-8eb3-97e76ce72c7f, please check neutron logs for more information. 
[ 732.112648] env[59659]: ERROR nova.compute.manager [ 732.112746] env[59659]: Traceback (most recent call last): [ 732.112746] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 732.112746] env[59659]: listener.cb(fileno) [ 732.112746] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 732.112746] env[59659]: result = function(*args, **kwargs) [ 732.112746] env[59659]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 732.112746] env[59659]: return func(*args, **kwargs) [ 732.112746] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 732.112746] env[59659]: raise e [ 732.112746] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 732.112746] env[59659]: nwinfo = self.network_api.allocate_for_instance( [ 732.112746] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 732.112746] env[59659]: created_port_ids = self._update_ports_for_instance( [ 732.112746] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 732.112746] env[59659]: with excutils.save_and_reraise_exception(): [ 732.112746] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 732.112746] env[59659]: self.force_reraise() [ 732.113165] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 732.113165] env[59659]: raise self.value [ 732.113165] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 732.113165] env[59659]: updated_port = self._update_port( [ 732.113165] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 732.113165] env[59659]: _ensure_no_port_binding_failure(port) [ 732.113165] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 732.113165] env[59659]: raise exception.PortBindingFailed(port_id=port['id']) [ 732.113165] env[59659]: nova.exception.PortBindingFailed: Binding failed for port 346f3b4a-8fe0-46e9-8eb3-97e76ce72c7f, please check neutron logs for more information. [ 732.113165] env[59659]: Removing descriptor: 23 [ 732.113528] env[59659]: ERROR nova.compute.manager [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 346f3b4a-8fe0-46e9-8eb3-97e76ce72c7f, please check neutron logs for more information. 
[ 732.113528] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] Traceback (most recent call last): [ 732.113528] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 732.113528] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] yield resources [ 732.113528] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 732.113528] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] self.driver.spawn(context, instance, image_meta, [ 732.113528] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 732.113528] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] self._vmops.spawn(context, instance, image_meta, injected_files, [ 732.113528] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 732.113528] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] vm_ref = self.build_virtual_machine(instance, [ 732.113528] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 732.113785] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] vif_infos = vmwarevif.get_vif_info(self._session, [ 732.113785] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 732.113785] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] for vif in network_info: [ 732.113785] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 732.113785] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] return self._sync_wrapper(fn, *args, **kwargs) [ 732.113785] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 732.113785] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] self.wait() [ 732.113785] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 732.113785] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] self[:] = self._gt.wait() [ 732.113785] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 732.113785] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] return self._exit_event.wait() [ 732.113785] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 732.113785] env[59659]: ERROR nova.compute.manager 
[instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] result = hub.switch() [ 732.113785] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 732.114170] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] return self.greenlet.switch() [ 732.114170] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 732.114170] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] result = function(*args, **kwargs) [ 732.114170] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 732.114170] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] return func(*args, **kwargs) [ 732.114170] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 732.114170] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] raise e [ 732.114170] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 732.114170] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] nwinfo = self.network_api.allocate_for_instance( [ 732.114170] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 732.114170] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] created_port_ids = self._update_ports_for_instance( [ 732.114170] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 732.114170] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] with excutils.save_and_reraise_exception(): [ 732.114493] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 732.114493] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] self.force_reraise() [ 732.114493] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 732.114493] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] raise self.value [ 732.114493] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 732.114493] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] updated_port = self._update_port( [ 732.114493] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 732.114493] env[59659]: ERROR nova.compute.manager [instance: 
ad58bbc3-1ec8-4567-ba07-c8161bcc8380] _ensure_no_port_binding_failure(port) [ 732.114493] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 732.114493] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] raise exception.PortBindingFailed(port_id=port['id']) [ 732.114493] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] nova.exception.PortBindingFailed: Binding failed for port 346f3b4a-8fe0-46e9-8eb3-97e76ce72c7f, please check neutron logs for more information. [ 732.114493] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] [ 732.114804] env[59659]: INFO nova.compute.manager [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] Terminating instance [ 732.116358] env[59659]: DEBUG oslo_concurrency.lockutils [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] Acquiring lock "refresh_cache-ad58bbc3-1ec8-4567-ba07-c8161bcc8380" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 732.116528] env[59659]: DEBUG oslo_concurrency.lockutils [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] Acquired lock "refresh_cache-ad58bbc3-1ec8-4567-ba07-c8161bcc8380" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 732.116719] env[59659]: DEBUG nova.network.neutron [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 732.119716] env[59659]: DEBUG nova.compute.manager [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] Starting instance... {{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 732.170538] env[59659]: DEBUG nova.network.neutron [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] Instance cache missing network info. 
{{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 732.179904] env[59659]: DEBUG oslo_concurrency.lockutils [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 732.180215] env[59659]: DEBUG oslo_concurrency.lockutils [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 732.181844] env[59659]: INFO nova.compute.claims [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 732.353227] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b70364ff-c119-4e1f-828d-51ce49acddbc {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 732.360790] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-307b5e6b-c5e4-45ee-bd97-2890f00e664d {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 732.397072] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-787449d5-4d31-4b08-a612-73a4803cbe79 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 732.405058] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-224d87d7-995b-4771-ba6f-b7ef8ae34e2e {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 732.421815] env[59659]: DEBUG nova.compute.provider_tree [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 732.430836] env[59659]: DEBUG nova.scheduler.client.report [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 732.446266] env[59659]: DEBUG oslo_concurrency.lockutils [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] Lock 
"compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.266s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 732.446786] env[59659]: DEBUG nova.compute.manager [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] Start building networks asynchronously for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 732.485565] env[59659]: DEBUG nova.compute.utils [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] Using /dev/sd instead of None {{(pid=59659) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 732.486600] env[59659]: DEBUG nova.compute.manager [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] Allocating IP information in the background. {{(pid=59659) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 732.486752] env[59659]: DEBUG nova.network.neutron [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] allocate_for_instance() {{(pid=59659) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 732.500186] env[59659]: DEBUG nova.compute.manager [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] Start building block device mappings for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 732.602736] env[59659]: DEBUG nova.compute.manager [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] Start spawning the instance on the hypervisor. 
{{(pid=59659) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 732.626220] env[59659]: DEBUG nova.virt.hardware [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-30T19:41:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-30T19:41:37Z,direct_url=,disk_format='vmdk',id=0fa786c9-f55e-46dc-b725-aa456ca9ff53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='3f4ccdf020b5413c9a4233eccb6b55a8',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-30T19:41:37Z,virtual_size=,visibility=), allow threads: False {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 732.626450] env[59659]: DEBUG nova.virt.hardware [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] Flavor limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 732.626604] env[59659]: DEBUG nova.virt.hardware [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] Image limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 732.626805] env[59659]: DEBUG nova.virt.hardware [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] Flavor pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 732.626910] env[59659]: DEBUG nova.virt.hardware [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] Image pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 732.627148] env[59659]: DEBUG nova.virt.hardware [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 732.627376] env[59659]: DEBUG nova.virt.hardware [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 732.627584] env[59659]: DEBUG nova.virt.hardware [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 732.627692] env[59659]: DEBUG nova.virt.hardware [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 
tempest-ServersTestJSON-1603309010-project-member] Got 1 possible topologies {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 732.627848] env[59659]: DEBUG nova.virt.hardware [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 732.628094] env[59659]: DEBUG nova.virt.hardware [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 732.628968] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-03750e1a-5c96-4de6-9f8b-5dc9219cca9e {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 732.637547] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b77c0544-8a0b-4f5d-8e4b-e70583213147 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 732.722185] env[59659]: DEBUG nova.policy [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0dcc88335cd04d18a843d54d7892e451', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cefaf2236f784faaa0f6d81a0ccda6f8', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59659) authorize /opt/stack/nova/nova/policy.py:203}} [ 732.780125] env[59659]: DEBUG nova.network.neutron [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 732.800796] env[59659]: DEBUG oslo_concurrency.lockutils [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] Releasing lock "refresh_cache-ad58bbc3-1ec8-4567-ba07-c8161bcc8380" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 732.800957] env[59659]: DEBUG nova.compute.manager [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] Start destroying the instance on the hypervisor. 
{{(pid=59659) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 732.801127] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] Destroying instance {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 732.801537] env[59659]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-71f20580-1ec5-42ac-b60b-13468da60b1f {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 732.814936] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b020a924-a8ba-4103-8df2-88e6e53524a7 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 732.852734] env[59659]: WARNING nova.virt.vmwareapi.vmops [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance ad58bbc3-1ec8-4567-ba07-c8161bcc8380 could not be found. [ 732.852734] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] Instance destroyed {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 732.852734] env[59659]: INFO nova.compute.manager [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] Took 0.05 seconds to destroy the instance on the hypervisor. [ 732.852734] env[59659]: DEBUG oslo.service.loopingcall [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59659) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 732.852984] env[59659]: DEBUG nova.network.neutron [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] Successfully created port: 54f30955-59af-40cf-b87d-43e5cf498d0d {{(pid=59659) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 732.854492] env[59659]: DEBUG nova.compute.manager [-] [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 732.854581] env[59659]: DEBUG nova.network.neutron [-] [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 732.958939] env[59659]: DEBUG nova.network.neutron [-] [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] Instance cache missing network info. 
{{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 732.967294] env[59659]: DEBUG nova.network.neutron [-] [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 732.979457] env[59659]: INFO nova.compute.manager [-] [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] Took 0.12 seconds to deallocate network for instance. [ 732.982440] env[59659]: DEBUG nova.compute.claims [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] Aborting claim: {{(pid=59659) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 732.982607] env[59659]: DEBUG oslo_concurrency.lockutils [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 732.982814] env[59659]: DEBUG oslo_concurrency.lockutils [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 733.128895] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-190d0705-1836-4a81-9c3d-28ca2ae3034f {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 733.137283] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d65e98a-c2b3-42ff-b23d-47d627e6eab7 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 733.169745] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ac95825-0386-4eeb-865f-5e63ff906a6f {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 733.179558] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1a18606-8999-4ff4-ac6b-806b4f5d4522 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 733.193772] env[59659]: DEBUG nova.compute.provider_tree [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 733.202322] env[59659]: DEBUG nova.scheduler.client.report [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 
'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 733.215926] env[59659]: DEBUG oslo_concurrency.lockutils [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.233s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 733.216631] env[59659]: ERROR nova.compute.manager [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 346f3b4a-8fe0-46e9-8eb3-97e76ce72c7f, please check neutron logs for more information. [ 733.216631] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] Traceback (most recent call last): [ 733.216631] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 733.216631] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] self.driver.spawn(context, instance, image_meta, [ 733.216631] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 733.216631] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] self._vmops.spawn(context, instance, image_meta, injected_files, [ 733.216631] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 733.216631] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] vm_ref = self.build_virtual_machine(instance, [ 733.216631] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 733.216631] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] vif_infos = vmwarevif.get_vif_info(self._session, [ 733.216631] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 733.216932] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] for vif in network_info: [ 733.216932] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 733.216932] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] return self._sync_wrapper(fn, *args, **kwargs) [ 733.216932] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 733.216932] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] self.wait() 
[ 733.216932] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 733.216932] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] self[:] = self._gt.wait() [ 733.216932] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 733.216932] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] return self._exit_event.wait() [ 733.216932] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 733.216932] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] result = hub.switch() [ 733.216932] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 733.216932] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] return self.greenlet.switch() [ 733.216932] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 733.217299] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] result = function(*args, **kwargs) [ 733.217299] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 733.217299] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] return func(*args, **kwargs) [ 733.217299] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 733.217299] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] raise e [ 733.217299] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 733.217299] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] nwinfo = self.network_api.allocate_for_instance( [ 733.217299] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 733.217299] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] created_port_ids = self._update_ports_for_instance( [ 733.217299] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 733.217299] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] with excutils.save_and_reraise_exception(): [ 733.217299] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 733.217299] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] self.force_reraise() [ 733.217618] env[59659]: ERROR 
nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 733.217618] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] raise self.value [ 733.217618] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 733.217618] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] updated_port = self._update_port( [ 733.217618] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 733.217618] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] _ensure_no_port_binding_failure(port) [ 733.217618] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 733.217618] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] raise exception.PortBindingFailed(port_id=port['id']) [ 733.217618] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] nova.exception.PortBindingFailed: Binding failed for port 346f3b4a-8fe0-46e9-8eb3-97e76ce72c7f, please check neutron logs for more information. [ 733.217618] env[59659]: ERROR nova.compute.manager [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] [ 733.218554] env[59659]: DEBUG nova.compute.utils [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] Binding failed for port 346f3b4a-8fe0-46e9-8eb3-97e76ce72c7f, please check neutron logs for more information. {{(pid=59659) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 733.219737] env[59659]: DEBUG nova.compute.manager [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] Build of instance ad58bbc3-1ec8-4567-ba07-c8161bcc8380 was re-scheduled: Binding failed for port 346f3b4a-8fe0-46e9-8eb3-97e76ce72c7f, please check neutron logs for more information. 
{{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 733.220102] env[59659]: DEBUG nova.compute.manager [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] Unplugging VIFs for instance {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 733.220373] env[59659]: DEBUG oslo_concurrency.lockutils [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] Acquiring lock "refresh_cache-ad58bbc3-1ec8-4567-ba07-c8161bcc8380" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 733.220654] env[59659]: DEBUG oslo_concurrency.lockutils [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] Acquired lock "refresh_cache-ad58bbc3-1ec8-4567-ba07-c8161bcc8380" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 733.220856] env[59659]: DEBUG nova.network.neutron [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 733.316046] env[59659]: DEBUG nova.network.neutron [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 734.127637] env[59659]: DEBUG nova.network.neutron [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 734.138237] env[59659]: DEBUG oslo_concurrency.lockutils [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] Releasing lock "refresh_cache-ad58bbc3-1ec8-4567-ba07-c8161bcc8380" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 734.138460] env[59659]: DEBUG nova.compute.manager [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 734.138637] env[59659]: DEBUG nova.compute.manager [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 734.138797] env[59659]: DEBUG nova.network.neutron [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 734.226426] env[59659]: DEBUG nova.network.neutron [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 734.234893] env[59659]: DEBUG nova.network.neutron [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 734.245258] env[59659]: INFO nova.compute.manager [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] [instance: ad58bbc3-1ec8-4567-ba07-c8161bcc8380] Took 0.11 seconds to deallocate network for instance. [ 734.370301] env[59659]: INFO nova.scheduler.client.report [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] Deleted allocations for instance ad58bbc3-1ec8-4567-ba07-c8161bcc8380 [ 734.391356] env[59659]: DEBUG oslo_concurrency.lockutils [None req-5dbb8e33-93eb-4bbc-9d22-46826868e029 tempest-AttachInterfacesTestJSON-113667225 tempest-AttachInterfacesTestJSON-113667225-project-member] Lock "ad58bbc3-1ec8-4567-ba07-c8161bcc8380" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 13.174s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 734.563055] env[59659]: ERROR nova.compute.manager [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port a63c96c0-e7e4-4bf2-9ea3-7ce0cbf6e40f, please check neutron logs for more information. 
[ 734.563055] env[59659]: ERROR nova.compute.manager Traceback (most recent call last): [ 734.563055] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 734.563055] env[59659]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 734.563055] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 734.563055] env[59659]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 734.563055] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 734.563055] env[59659]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 734.563055] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 734.563055] env[59659]: ERROR nova.compute.manager self.force_reraise() [ 734.563055] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 734.563055] env[59659]: ERROR nova.compute.manager raise self.value [ 734.563055] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 734.563055] env[59659]: ERROR nova.compute.manager updated_port = self._update_port( [ 734.563055] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 734.563055] env[59659]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 734.563604] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 734.563604] env[59659]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 734.563604] env[59659]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port a63c96c0-e7e4-4bf2-9ea3-7ce0cbf6e40f, please check neutron logs for more information. 
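Every failure in this log ends in the same two frames: nova's _update_port calls _ensure_no_port_binding_failure(port), which raises nova.exception.PortBindingFailed for the port Neutron returned. The sketch below is an illustration of that pattern only, not the nova source; the assumption that the check keys off Neutron's binding:vif_type attribute carrying the value 'binding_failed' is mine, not confirmed by this log. Only the exception message and the function/port names come from the traceback above.

# Illustrative sketch (not the nova source) of the check implied by the
# "_ensure_no_port_binding_failure" frames above.
class PortBindingFailed(Exception):
    def __init__(self, port_id):
        super().__init__(
            "Binding failed for port %s, please check neutron logs for "
            "more information." % port_id)
        self.port_id = port_id

def ensure_no_port_binding_failure(port):
    """Raise if Neutron reported that it could not bind the port."""
    # Assumption for illustration: an unbindable port is flagged via the
    # standard Neutron 'binding:vif_type' attribute.
    if port.get('binding:vif_type') == 'binding_failed':
        raise PortBindingFailed(port_id=port['id'])

# Usage example with a port shaped like the one rejected above.
try:
    ensure_no_port_binding_failure(
        {'id': 'a63c96c0-e7e4-4bf2-9ea3-7ce0cbf6e40f',
         'binding:vif_type': 'binding_failed'})
except PortBindingFailed as exc:
    print(exc)  # Binding failed for port a63c96c0-..., please check neutron logs ...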
[ 734.563604] env[59659]: ERROR nova.compute.manager [ 734.563604] env[59659]: Traceback (most recent call last): [ 734.563604] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 734.563604] env[59659]: listener.cb(fileno) [ 734.563604] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 734.563604] env[59659]: result = function(*args, **kwargs) [ 734.563604] env[59659]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 734.563604] env[59659]: return func(*args, **kwargs) [ 734.563604] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 734.563604] env[59659]: raise e [ 734.563604] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 734.563604] env[59659]: nwinfo = self.network_api.allocate_for_instance( [ 734.563604] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 734.563604] env[59659]: created_port_ids = self._update_ports_for_instance( [ 734.563604] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 734.563604] env[59659]: with excutils.save_and_reraise_exception(): [ 734.563604] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 734.563604] env[59659]: self.force_reraise() [ 734.563604] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 734.563604] env[59659]: raise self.value [ 734.563604] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 734.563604] env[59659]: updated_port = self._update_port( [ 734.563604] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 734.563604] env[59659]: _ensure_no_port_binding_failure(port) [ 734.563604] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 734.563604] env[59659]: raise exception.PortBindingFailed(port_id=port['id']) [ 734.564330] env[59659]: nova.exception.PortBindingFailed: Binding failed for port a63c96c0-e7e4-4bf2-9ea3-7ce0cbf6e40f, please check neutron logs for more information. [ 734.564330] env[59659]: Removing descriptor: 14 [ 734.564330] env[59659]: ERROR nova.compute.manager [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: a411c5e7-5a49-463e-b270-800e35a31188] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port a63c96c0-e7e4-4bf2-9ea3-7ce0cbf6e40f, please check neutron logs for more information. 
[ 734.564330] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] Traceback (most recent call last): [ 734.564330] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 734.564330] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] yield resources [ 734.564330] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 734.564330] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] self.driver.spawn(context, instance, image_meta, [ 734.564330] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 734.564330] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] self._vmops.spawn(context, instance, image_meta, injected_files, [ 734.564330] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 734.564330] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] vm_ref = self.build_virtual_machine(instance, [ 734.564641] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 734.564641] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] vif_infos = vmwarevif.get_vif_info(self._session, [ 734.564641] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 734.564641] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] for vif in network_info: [ 734.564641] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 734.564641] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] return self._sync_wrapper(fn, *args, **kwargs) [ 734.564641] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 734.564641] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] self.wait() [ 734.564641] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 734.564641] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] self[:] = self._gt.wait() [ 734.564641] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 734.564641] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] return self._exit_event.wait() [ 734.564641] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 734.564641] env[59659]: ERROR nova.compute.manager 
[instance: a411c5e7-5a49-463e-b270-800e35a31188] result = hub.switch() [ 734.564991] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 734.564991] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] return self.greenlet.switch() [ 734.564991] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 734.564991] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] result = function(*args, **kwargs) [ 734.564991] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 734.564991] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] return func(*args, **kwargs) [ 734.564991] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 734.564991] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] raise e [ 734.564991] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 734.564991] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] nwinfo = self.network_api.allocate_for_instance( [ 734.564991] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 734.564991] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] created_port_ids = self._update_ports_for_instance( [ 734.564991] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 734.565315] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] with excutils.save_and_reraise_exception(): [ 734.565315] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 734.565315] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] self.force_reraise() [ 734.565315] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 734.565315] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] raise self.value [ 734.565315] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 734.565315] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] updated_port = self._update_port( [ 734.565315] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 734.565315] env[59659]: ERROR nova.compute.manager [instance: 
a411c5e7-5a49-463e-b270-800e35a31188] _ensure_no_port_binding_failure(port) [ 734.565315] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 734.565315] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] raise exception.PortBindingFailed(port_id=port['id']) [ 734.565315] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] nova.exception.PortBindingFailed: Binding failed for port a63c96c0-e7e4-4bf2-9ea3-7ce0cbf6e40f, please check neutron logs for more information. [ 734.565315] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] [ 734.565635] env[59659]: INFO nova.compute.manager [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: a411c5e7-5a49-463e-b270-800e35a31188] Terminating instance [ 734.569141] env[59659]: DEBUG oslo_concurrency.lockutils [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Acquiring lock "refresh_cache-a411c5e7-5a49-463e-b270-800e35a31188" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 734.569141] env[59659]: DEBUG oslo_concurrency.lockutils [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Acquired lock "refresh_cache-a411c5e7-5a49-463e-b270-800e35a31188" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 734.569141] env[59659]: DEBUG nova.network.neutron [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: a411c5e7-5a49-463e-b270-800e35a31188] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 734.879341] env[59659]: DEBUG nova.network.neutron [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: a411c5e7-5a49-463e-b270-800e35a31188] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 735.047125] env[59659]: DEBUG nova.network.neutron [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] Successfully created port: 48ed80ba-0203-40d7-adfe-9f69bcfe2e45 {{(pid=59659) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 735.067929] env[59659]: ERROR nova.compute.manager [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port ef1cd5c0-1945-454d-a3f4-40d46a56364f, please check neutron logs for more information. 
[ 735.067929] env[59659]: ERROR nova.compute.manager Traceback (most recent call last): [ 735.067929] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 735.067929] env[59659]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 735.067929] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 735.067929] env[59659]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 735.067929] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 735.067929] env[59659]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 735.067929] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 735.067929] env[59659]: ERROR nova.compute.manager self.force_reraise() [ 735.067929] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 735.067929] env[59659]: ERROR nova.compute.manager raise self.value [ 735.067929] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 735.067929] env[59659]: ERROR nova.compute.manager updated_port = self._update_port( [ 735.067929] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 735.067929] env[59659]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 735.068679] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 735.068679] env[59659]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 735.068679] env[59659]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port ef1cd5c0-1945-454d-a3f4-40d46a56364f, please check neutron logs for more information. 
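The recurring "__exit__ -> force_reraise() -> raise self.value" frames in the tracebacks above come from oslo.utils' save_and_reraise_exception context manager: it captures the in-flight exception, lets cleanup code inside the with-block run, and then re-raises the original exception on exit, which is why the real PortBindingFailed only appears below those frames. A minimal sketch of the idiom follows; the idiom is from oslo_utils.excutils, but the update/rollback callables are hypothetical stand-ins, not nova code.

# Sketch of the oslo.utils idiom seen in every traceback above.
from oslo_utils import excutils

def update_port_with_rollback(update, rollback, port):
    try:
        return update(port)
    except Exception:
        # The original exception is saved, the rollback runs, and the same
        # exception is re-raised when the context manager exits.
        with excutils.save_and_reraise_exception():
            rollback(port)

The context manager object also exposes a reraise flag that a caller can clear to swallow the exception; the frames above show the default path, where it re-raises.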
[ 735.068679] env[59659]: ERROR nova.compute.manager [ 735.068679] env[59659]: Traceback (most recent call last): [ 735.068679] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 735.068679] env[59659]: listener.cb(fileno) [ 735.068679] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 735.068679] env[59659]: result = function(*args, **kwargs) [ 735.068679] env[59659]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 735.068679] env[59659]: return func(*args, **kwargs) [ 735.068679] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 735.068679] env[59659]: raise e [ 735.068679] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 735.068679] env[59659]: nwinfo = self.network_api.allocate_for_instance( [ 735.068679] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 735.068679] env[59659]: created_port_ids = self._update_ports_for_instance( [ 735.068679] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 735.068679] env[59659]: with excutils.save_and_reraise_exception(): [ 735.068679] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 735.068679] env[59659]: self.force_reraise() [ 735.068679] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 735.068679] env[59659]: raise self.value [ 735.068679] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 735.068679] env[59659]: updated_port = self._update_port( [ 735.068679] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 735.068679] env[59659]: _ensure_no_port_binding_failure(port) [ 735.068679] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 735.068679] env[59659]: raise exception.PortBindingFailed(port_id=port['id']) [ 735.069844] env[59659]: nova.exception.PortBindingFailed: Binding failed for port ef1cd5c0-1945-454d-a3f4-40d46a56364f, please check neutron logs for more information. [ 735.069844] env[59659]: Removing descriptor: 12 [ 735.069844] env[59659]: ERROR nova.compute.manager [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] [instance: 1670e7a3-656a-444d-85ed-292956498612] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port ef1cd5c0-1945-454d-a3f4-40d46a56364f, please check neutron logs for more information. 
[ 735.069844] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] Traceback (most recent call last): [ 735.069844] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 735.069844] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] yield resources [ 735.069844] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 735.069844] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] self.driver.spawn(context, instance, image_meta, [ 735.069844] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 735.069844] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] self._vmops.spawn(context, instance, image_meta, injected_files, [ 735.069844] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 735.069844] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] vm_ref = self.build_virtual_machine(instance, [ 735.070603] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 735.070603] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] vif_infos = vmwarevif.get_vif_info(self._session, [ 735.070603] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 735.070603] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] for vif in network_info: [ 735.070603] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 735.070603] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] return self._sync_wrapper(fn, *args, **kwargs) [ 735.070603] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 735.070603] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] self.wait() [ 735.070603] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 735.070603] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] self[:] = self._gt.wait() [ 735.070603] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 735.070603] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] return self._exit_event.wait() [ 735.070603] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 735.070603] env[59659]: ERROR nova.compute.manager 
[instance: 1670e7a3-656a-444d-85ed-292956498612] result = hub.switch() [ 735.071240] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 735.071240] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] return self.greenlet.switch() [ 735.071240] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 735.071240] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] result = function(*args, **kwargs) [ 735.071240] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 735.071240] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] return func(*args, **kwargs) [ 735.071240] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 735.071240] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] raise e [ 735.071240] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 735.071240] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] nwinfo = self.network_api.allocate_for_instance( [ 735.071240] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 735.071240] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] created_port_ids = self._update_ports_for_instance( [ 735.071240] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 735.071751] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] with excutils.save_and_reraise_exception(): [ 735.071751] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 735.071751] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] self.force_reraise() [ 735.071751] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 735.071751] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] raise self.value [ 735.071751] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 735.071751] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] updated_port = self._update_port( [ 735.071751] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 735.071751] env[59659]: ERROR nova.compute.manager [instance: 
1670e7a3-656a-444d-85ed-292956498612] _ensure_no_port_binding_failure(port) [ 735.071751] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 735.071751] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] raise exception.PortBindingFailed(port_id=port['id']) [ 735.071751] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] nova.exception.PortBindingFailed: Binding failed for port ef1cd5c0-1945-454d-a3f4-40d46a56364f, please check neutron logs for more information. [ 735.071751] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] [ 735.072367] env[59659]: INFO nova.compute.manager [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] [instance: 1670e7a3-656a-444d-85ed-292956498612] Terminating instance [ 735.072768] env[59659]: DEBUG oslo_concurrency.lockutils [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] Acquiring lock "refresh_cache-1670e7a3-656a-444d-85ed-292956498612" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 735.072925] env[59659]: DEBUG oslo_concurrency.lockutils [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] Acquired lock "refresh_cache-1670e7a3-656a-444d-85ed-292956498612" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 735.073484] env[59659]: DEBUG nova.network.neutron [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] [instance: 1670e7a3-656a-444d-85ed-292956498612] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 735.153933] env[59659]: DEBUG nova.network.neutron [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] [instance: 1670e7a3-656a-444d-85ed-292956498612] Instance cache missing network info. 
{{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 735.494347] env[59659]: DEBUG nova.network.neutron [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: a411c5e7-5a49-463e-b270-800e35a31188] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 735.503736] env[59659]: DEBUG oslo_concurrency.lockutils [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Releasing lock "refresh_cache-a411c5e7-5a49-463e-b270-800e35a31188" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 735.504156] env[59659]: DEBUG nova.compute.manager [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: a411c5e7-5a49-463e-b270-800e35a31188] Start destroying the instance on the hypervisor. {{(pid=59659) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 735.504351] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: a411c5e7-5a49-463e-b270-800e35a31188] Destroying instance {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 735.504898] env[59659]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-ab91fa10-4ce9-49fb-8f13-0fb20f96a006 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.515462] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d48901a-4495-431d-ab97-43fdf2cc88dc {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.540228] env[59659]: WARNING nova.virt.vmwareapi.vmops [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: a411c5e7-5a49-463e-b270-800e35a31188] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance a411c5e7-5a49-463e-b270-800e35a31188 could not be found. [ 735.540458] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: a411c5e7-5a49-463e-b270-800e35a31188] Instance destroyed {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 735.540633] env[59659]: INFO nova.compute.manager [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: a411c5e7-5a49-463e-b270-800e35a31188] Took 0.04 seconds to destroy the instance on the hypervisor. [ 735.540878] env[59659]: DEBUG oslo.service.loopingcall [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=59659) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 735.541154] env[59659]: DEBUG nova.compute.manager [-] [instance: a411c5e7-5a49-463e-b270-800e35a31188] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 735.541248] env[59659]: DEBUG nova.network.neutron [-] [instance: a411c5e7-5a49-463e-b270-800e35a31188] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 735.638970] env[59659]: DEBUG nova.network.neutron [-] [instance: a411c5e7-5a49-463e-b270-800e35a31188] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 735.648743] env[59659]: DEBUG nova.network.neutron [-] [instance: a411c5e7-5a49-463e-b270-800e35a31188] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 735.658495] env[59659]: INFO nova.compute.manager [-] [instance: a411c5e7-5a49-463e-b270-800e35a31188] Took 0.12 seconds to deallocate network for instance. [ 735.662498] env[59659]: DEBUG nova.compute.claims [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: a411c5e7-5a49-463e-b270-800e35a31188] Aborting claim: {{(pid=59659) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 735.662675] env[59659]: DEBUG oslo_concurrency.lockutils [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 735.662877] env[59659]: DEBUG oslo_concurrency.lockutils [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 735.701782] env[59659]: DEBUG nova.network.neutron [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] [instance: 1670e7a3-656a-444d-85ed-292956498612] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 735.713633] env[59659]: DEBUG oslo_concurrency.lockutils [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] Releasing lock "refresh_cache-1670e7a3-656a-444d-85ed-292956498612" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 735.714099] env[59659]: DEBUG nova.compute.manager [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] [instance: 1670e7a3-656a-444d-85ed-292956498612] Start destroying the instance on the hypervisor. 
{{(pid=59659) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 735.714290] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] [instance: 1670e7a3-656a-444d-85ed-292956498612] Destroying instance {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 735.714957] env[59659]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-c2b1b79f-b343-4e82-ad44-88d2dc282663 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.725613] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f61c65d9-a6a7-46a1-bfe2-1d8db49275bb {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.756955] env[59659]: WARNING nova.virt.vmwareapi.vmops [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] [instance: 1670e7a3-656a-444d-85ed-292956498612] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 1670e7a3-656a-444d-85ed-292956498612 could not be found. [ 735.757460] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] [instance: 1670e7a3-656a-444d-85ed-292956498612] Instance destroyed {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 735.757642] env[59659]: INFO nova.compute.manager [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] [instance: 1670e7a3-656a-444d-85ed-292956498612] Took 0.04 seconds to destroy the instance on the hypervisor. [ 735.757881] env[59659]: DEBUG oslo.service.loopingcall [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59659) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 735.762488] env[59659]: DEBUG nova.compute.manager [-] [instance: 1670e7a3-656a-444d-85ed-292956498612] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 735.762488] env[59659]: DEBUG nova.network.neutron [-] [instance: 1670e7a3-656a-444d-85ed-292956498612] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 735.810109] env[59659]: DEBUG nova.network.neutron [-] [instance: 1670e7a3-656a-444d-85ed-292956498612] Instance cache missing network info. 
{{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 735.815227] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7a19d6ee-a703-4406-b3b4-e361eb671e8e {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.822483] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16b8b57f-20d4-4341-95fe-8fcb99402729 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.825833] env[59659]: DEBUG nova.network.neutron [-] [instance: 1670e7a3-656a-444d-85ed-292956498612] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 735.859907] env[59659]: INFO nova.compute.manager [-] [instance: 1670e7a3-656a-444d-85ed-292956498612] Took 0.10 seconds to deallocate network for instance. [ 735.860780] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb85247c-e9bf-4e33-8b53-55282e8f2903 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.866197] env[59659]: DEBUG nova.compute.claims [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] [instance: 1670e7a3-656a-444d-85ed-292956498612] Aborting claim: {{(pid=59659) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 735.866448] env[59659]: DEBUG oslo_concurrency.lockutils [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 735.872865] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7bf18ba-9785-41a1-af4d-f5a2ab8143ff {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.889313] env[59659]: DEBUG nova.compute.provider_tree [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 735.896994] env[59659]: DEBUG nova.scheduler.client.report [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 735.913565] env[59659]: DEBUG oslo_concurrency.lockutils [None req-07b21824-c498-423f-91e4-177f5a80294a 
tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.250s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 735.914218] env[59659]: ERROR nova.compute.manager [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: a411c5e7-5a49-463e-b270-800e35a31188] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port a63c96c0-e7e4-4bf2-9ea3-7ce0cbf6e40f, please check neutron logs for more information. [ 735.914218] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] Traceback (most recent call last): [ 735.914218] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 735.914218] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] self.driver.spawn(context, instance, image_meta, [ 735.914218] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 735.914218] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] self._vmops.spawn(context, instance, image_meta, injected_files, [ 735.914218] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 735.914218] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] vm_ref = self.build_virtual_machine(instance, [ 735.914218] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 735.914218] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] vif_infos = vmwarevif.get_vif_info(self._session, [ 735.914218] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 735.914557] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] for vif in network_info: [ 735.914557] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 735.914557] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] return self._sync_wrapper(fn, *args, **kwargs) [ 735.914557] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 735.914557] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] self.wait() [ 735.914557] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 735.914557] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] self[:] = self._gt.wait() [ 735.914557] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] File 
"/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 735.914557] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] return self._exit_event.wait() [ 735.914557] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 735.914557] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] result = hub.switch() [ 735.914557] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 735.914557] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] return self.greenlet.switch() [ 735.914557] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 735.914946] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] result = function(*args, **kwargs) [ 735.914946] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 735.914946] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] return func(*args, **kwargs) [ 735.914946] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 735.914946] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] raise e [ 735.914946] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 735.914946] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] nwinfo = self.network_api.allocate_for_instance( [ 735.914946] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 735.914946] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] created_port_ids = self._update_ports_for_instance( [ 735.914946] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 735.914946] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] with excutils.save_and_reraise_exception(): [ 735.914946] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 735.914946] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] self.force_reraise() [ 735.915316] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 735.915316] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] raise self.value [ 735.915316] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] File 
"/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 735.915316] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] updated_port = self._update_port( [ 735.915316] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 735.915316] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] _ensure_no_port_binding_failure(port) [ 735.915316] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 735.915316] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] raise exception.PortBindingFailed(port_id=port['id']) [ 735.915316] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] nova.exception.PortBindingFailed: Binding failed for port a63c96c0-e7e4-4bf2-9ea3-7ce0cbf6e40f, please check neutron logs for more information. [ 735.915316] env[59659]: ERROR nova.compute.manager [instance: a411c5e7-5a49-463e-b270-800e35a31188] [ 735.915316] env[59659]: DEBUG nova.compute.utils [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: a411c5e7-5a49-463e-b270-800e35a31188] Binding failed for port a63c96c0-e7e4-4bf2-9ea3-7ce0cbf6e40f, please check neutron logs for more information. {{(pid=59659) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 735.916265] env[59659]: DEBUG oslo_concurrency.lockutils [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.050s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 735.922343] env[59659]: DEBUG nova.compute.manager [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: a411c5e7-5a49-463e-b270-800e35a31188] Build of instance a411c5e7-5a49-463e-b270-800e35a31188 was re-scheduled: Binding failed for port a63c96c0-e7e4-4bf2-9ea3-7ce0cbf6e40f, please check neutron logs for more information. 
{{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 735.922343] env[59659]: DEBUG nova.compute.manager [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: a411c5e7-5a49-463e-b270-800e35a31188] Unplugging VIFs for instance {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 735.922343] env[59659]: DEBUG oslo_concurrency.lockutils [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Acquiring lock "refresh_cache-a411c5e7-5a49-463e-b270-800e35a31188" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 735.922343] env[59659]: DEBUG oslo_concurrency.lockutils [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Acquired lock "refresh_cache-a411c5e7-5a49-463e-b270-800e35a31188" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 735.922536] env[59659]: DEBUG nova.network.neutron [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: a411c5e7-5a49-463e-b270-800e35a31188] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 735.990733] env[59659]: DEBUG nova.network.neutron [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: a411c5e7-5a49-463e-b270-800e35a31188] Instance cache missing network info. 
{{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 736.038375] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bcd78284-8ed5-4a3d-946a-3f065a472a90 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 736.046509] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83a885cc-d876-4190-ab2f-1549dfbfab61 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 736.081618] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd8620dc-eafd-4c64-b8aa-727c202751be {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 736.090366] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f1c165c-3ac5-4ff7-9583-c92b843f12ba {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 736.105436] env[59659]: DEBUG nova.compute.provider_tree [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 736.120128] env[59659]: DEBUG nova.scheduler.client.report [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 736.144042] env[59659]: DEBUG oslo_concurrency.lockutils [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.226s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 736.144042] env[59659]: ERROR nova.compute.manager [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] [instance: 1670e7a3-656a-444d-85ed-292956498612] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port ef1cd5c0-1945-454d-a3f4-40d46a56364f, please check neutron logs for more information. 
[ 736.144042] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] Traceback (most recent call last): [ 736.144042] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 736.144042] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] self.driver.spawn(context, instance, image_meta, [ 736.144042] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 736.144042] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] self._vmops.spawn(context, instance, image_meta, injected_files, [ 736.144042] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 736.144042] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] vm_ref = self.build_virtual_machine(instance, [ 736.144453] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 736.144453] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] vif_infos = vmwarevif.get_vif_info(self._session, [ 736.144453] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 736.144453] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] for vif in network_info: [ 736.144453] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 736.144453] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] return self._sync_wrapper(fn, *args, **kwargs) [ 736.144453] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 736.144453] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] self.wait() [ 736.144453] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 736.144453] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] self[:] = self._gt.wait() [ 736.144453] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 736.144453] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] return self._exit_event.wait() [ 736.144453] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 736.144453] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] result = hub.switch() [ 736.144841] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 736.144841] env[59659]: ERROR 
nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] return self.greenlet.switch() [ 736.144841] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 736.144841] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] result = function(*args, **kwargs) [ 736.144841] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 736.144841] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] return func(*args, **kwargs) [ 736.144841] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 736.144841] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] raise e [ 736.144841] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 736.144841] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] nwinfo = self.network_api.allocate_for_instance( [ 736.144841] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 736.144841] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] created_port_ids = self._update_ports_for_instance( [ 736.144841] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 736.145157] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] with excutils.save_and_reraise_exception(): [ 736.145157] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 736.145157] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] self.force_reraise() [ 736.145157] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 736.145157] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] raise self.value [ 736.145157] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 736.145157] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] updated_port = self._update_port( [ 736.145157] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 736.145157] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] _ensure_no_port_binding_failure(port) [ 736.145157] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 736.145157] env[59659]: ERROR 
nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] raise exception.PortBindingFailed(port_id=port['id']) [ 736.145157] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] nova.exception.PortBindingFailed: Binding failed for port ef1cd5c0-1945-454d-a3f4-40d46a56364f, please check neutron logs for more information. [ 736.145157] env[59659]: ERROR nova.compute.manager [instance: 1670e7a3-656a-444d-85ed-292956498612] [ 736.145638] env[59659]: DEBUG nova.compute.utils [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] [instance: 1670e7a3-656a-444d-85ed-292956498612] Binding failed for port ef1cd5c0-1945-454d-a3f4-40d46a56364f, please check neutron logs for more information. {{(pid=59659) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 736.146562] env[59659]: DEBUG nova.compute.manager [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] [instance: 1670e7a3-656a-444d-85ed-292956498612] Build of instance 1670e7a3-656a-444d-85ed-292956498612 was re-scheduled: Binding failed for port ef1cd5c0-1945-454d-a3f4-40d46a56364f, please check neutron logs for more information. {{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 736.146562] env[59659]: DEBUG nova.compute.manager [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] [instance: 1670e7a3-656a-444d-85ed-292956498612] Unplugging VIFs for instance {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 736.146716] env[59659]: DEBUG oslo_concurrency.lockutils [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] Acquiring lock "refresh_cache-1670e7a3-656a-444d-85ed-292956498612" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 736.146784] env[59659]: DEBUG oslo_concurrency.lockutils [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] Acquired lock "refresh_cache-1670e7a3-656a-444d-85ed-292956498612" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 736.146936] env[59659]: DEBUG nova.network.neutron [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] [instance: 1670e7a3-656a-444d-85ed-292956498612] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 736.275049] env[59659]: DEBUG nova.network.neutron [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] [instance: 1670e7a3-656a-444d-85ed-292956498612] Instance cache missing network info. 
{{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 736.424634] env[59659]: DEBUG nova.network.neutron [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: a411c5e7-5a49-463e-b270-800e35a31188] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 736.439580] env[59659]: DEBUG oslo_concurrency.lockutils [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Releasing lock "refresh_cache-a411c5e7-5a49-463e-b270-800e35a31188" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 736.439808] env[59659]: DEBUG nova.compute.manager [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 736.439982] env[59659]: DEBUG nova.compute.manager [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: a411c5e7-5a49-463e-b270-800e35a31188] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 736.440176] env[59659]: DEBUG nova.network.neutron [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: a411c5e7-5a49-463e-b270-800e35a31188] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 736.521069] env[59659]: DEBUG nova.network.neutron [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: a411c5e7-5a49-463e-b270-800e35a31188] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 736.546196] env[59659]: DEBUG nova.network.neutron [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: a411c5e7-5a49-463e-b270-800e35a31188] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 736.559784] env[59659]: INFO nova.compute.manager [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: a411c5e7-5a49-463e-b270-800e35a31188] Took 0.12 seconds to deallocate network for instance. 
[ 736.658420] env[59659]: DEBUG oslo_concurrency.lockutils [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] Acquiring lock "508c2a14-5f5b-4968-843a-1378d1c46e2f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 736.662683] env[59659]: DEBUG oslo_concurrency.lockutils [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] Lock "508c2a14-5f5b-4968-843a-1378d1c46e2f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.004s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 736.675739] env[59659]: DEBUG nova.compute.manager [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] Starting instance... {{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 736.700721] env[59659]: INFO nova.scheduler.client.report [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Deleted allocations for instance a411c5e7-5a49-463e-b270-800e35a31188 [ 736.723348] env[59659]: DEBUG oslo_concurrency.lockutils [None req-07b21824-c498-423f-91e4-177f5a80294a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Lock "a411c5e7-5a49-463e-b270-800e35a31188" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 11.951s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 736.735831] env[59659]: DEBUG oslo_concurrency.lockutils [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 736.736087] env[59659]: DEBUG oslo_concurrency.lockutils [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 736.739340] env[59659]: INFO nova.compute.claims [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 736.765925] env[59659]: DEBUG nova.network.neutron [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] [instance: 1670e7a3-656a-444d-85ed-292956498612] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 
736.780334] env[59659]: DEBUG oslo_concurrency.lockutils [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] Releasing lock "refresh_cache-1670e7a3-656a-444d-85ed-292956498612" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 736.780543] env[59659]: DEBUG nova.compute.manager [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 736.780719] env[59659]: DEBUG nova.compute.manager [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] [instance: 1670e7a3-656a-444d-85ed-292956498612] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 736.780872] env[59659]: DEBUG nova.network.neutron [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] [instance: 1670e7a3-656a-444d-85ed-292956498612] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 736.872326] env[59659]: DEBUG nova.network.neutron [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] [instance: 1670e7a3-656a-444d-85ed-292956498612] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 736.881824] env[59659]: DEBUG nova.network.neutron [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] [instance: 1670e7a3-656a-444d-85ed-292956498612] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 736.895143] env[59659]: INFO nova.compute.manager [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] [instance: 1670e7a3-656a-444d-85ed-292956498612] Took 0.11 seconds to deallocate network for instance. 
[ 736.928091] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-240593eb-5570-49cf-8f26-c77fe21fc1d7 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 736.937684] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ebabbcce-6bab-4408-b3b8-aa730cd9f1f3 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 736.982014] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5776fd2f-008d-4520-8a54-f8e42c2dfcfc {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 736.994978] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f63aec7f-99d0-491a-aafc-0a44e49d0b2d {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 737.010424] env[59659]: DEBUG nova.compute.provider_tree [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 737.012303] env[59659]: INFO nova.scheduler.client.report [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] Deleted allocations for instance 1670e7a3-656a-444d-85ed-292956498612 [ 737.020352] env[59659]: DEBUG nova.scheduler.client.report [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 737.032114] env[59659]: DEBUG oslo_concurrency.lockutils [None req-4eb49f41-f4bb-489a-b035-264871b492e1 tempest-ServerRescueTestJSONUnderV235-580767535 tempest-ServerRescueTestJSONUnderV235-580767535-project-member] Lock "1670e7a3-656a-444d-85ed-292956498612" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 8.294s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 737.055173] env[59659]: DEBUG oslo_concurrency.lockutils [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.318s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 737.058265] env[59659]: DEBUG nova.compute.manager [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 
tempest-AttachVolumeTestJSON-1478151191-project-member] [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] Start building networks asynchronously for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 737.099740] env[59659]: DEBUG nova.compute.utils [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] Using /dev/sd instead of None {{(pid=59659) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 737.100998] env[59659]: DEBUG nova.compute.manager [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] Allocating IP information in the background. {{(pid=59659) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 737.101215] env[59659]: DEBUG nova.network.neutron [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] allocate_for_instance() {{(pid=59659) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 737.116613] env[59659]: DEBUG nova.compute.manager [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] Start building block device mappings for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 737.195985] env[59659]: DEBUG nova.compute.manager [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] Start spawning the instance on the hypervisor. 
{{(pid=59659) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 737.221892] env[59659]: DEBUG nova.virt.hardware [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-30T19:41:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-30T19:41:37Z,direct_url=,disk_format='vmdk',id=0fa786c9-f55e-46dc-b725-aa456ca9ff53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='3f4ccdf020b5413c9a4233eccb6b55a8',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-30T19:41:37Z,virtual_size=,visibility=), allow threads: False {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 737.221892] env[59659]: DEBUG nova.virt.hardware [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] Flavor limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 737.221892] env[59659]: DEBUG nova.virt.hardware [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] Image limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 737.223417] env[59659]: DEBUG nova.virt.hardware [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] Flavor pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 737.223417] env[59659]: DEBUG nova.virt.hardware [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] Image pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 737.223417] env[59659]: DEBUG nova.virt.hardware [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 737.223417] env[59659]: DEBUG nova.virt.hardware [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 737.223417] env[59659]: DEBUG nova.virt.hardware [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 737.223691] env[59659]: DEBUG nova.virt.hardware [None 
req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] Got 1 possible topologies {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 737.227289] env[59659]: DEBUG nova.virt.hardware [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 737.227528] env[59659]: DEBUG nova.virt.hardware [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 737.228653] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-650e7e8b-b3fe-4d79-8b58-4ec307b7a46f {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 737.237960] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d0b224f-fb9d-461a-81c5-5a63dd8a357d {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 737.440032] env[59659]: DEBUG nova.policy [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cd26bdcbd8994834ba745c00358233a6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '35f14b46451443d48ce9154de8a09045', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59659) authorize /opt/stack/nova/nova/policy.py:203}} [ 738.652911] env[59659]: ERROR nova.compute.manager [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 54f30955-59af-40cf-b87d-43e5cf498d0d, please check neutron logs for more information. 
[ 738.652911] env[59659]: ERROR nova.compute.manager Traceback (most recent call last): [ 738.652911] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 738.652911] env[59659]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 738.652911] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 738.652911] env[59659]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 738.652911] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 738.652911] env[59659]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 738.652911] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 738.652911] env[59659]: ERROR nova.compute.manager self.force_reraise() [ 738.652911] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 738.652911] env[59659]: ERROR nova.compute.manager raise self.value [ 738.652911] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 738.652911] env[59659]: ERROR nova.compute.manager updated_port = self._update_port( [ 738.652911] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 738.652911] env[59659]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 738.654626] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 738.654626] env[59659]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 738.654626] env[59659]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 54f30955-59af-40cf-b87d-43e5cf498d0d, please check neutron logs for more information. 
[ 738.654626] env[59659]: ERROR nova.compute.manager [ 738.654626] env[59659]: Traceback (most recent call last): [ 738.654626] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 738.654626] env[59659]: listener.cb(fileno) [ 738.654626] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 738.654626] env[59659]: result = function(*args, **kwargs) [ 738.654626] env[59659]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 738.654626] env[59659]: return func(*args, **kwargs) [ 738.654626] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 738.654626] env[59659]: raise e [ 738.654626] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 738.654626] env[59659]: nwinfo = self.network_api.allocate_for_instance( [ 738.654626] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 738.654626] env[59659]: created_port_ids = self._update_ports_for_instance( [ 738.654626] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 738.654626] env[59659]: with excutils.save_and_reraise_exception(): [ 738.654626] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 738.654626] env[59659]: self.force_reraise() [ 738.654626] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 738.654626] env[59659]: raise self.value [ 738.654626] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 738.654626] env[59659]: updated_port = self._update_port( [ 738.654626] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 738.654626] env[59659]: _ensure_no_port_binding_failure(port) [ 738.654626] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 738.654626] env[59659]: raise exception.PortBindingFailed(port_id=port['id']) [ 738.656501] env[59659]: nova.exception.PortBindingFailed: Binding failed for port 54f30955-59af-40cf-b87d-43e5cf498d0d, please check neutron logs for more information. [ 738.656501] env[59659]: Removing descriptor: 22 [ 738.656501] env[59659]: ERROR nova.compute.manager [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 54f30955-59af-40cf-b87d-43e5cf498d0d, please check neutron logs for more information. 
[ 738.656501] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] Traceback (most recent call last): [ 738.656501] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 738.656501] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] yield resources [ 738.656501] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 738.656501] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] self.driver.spawn(context, instance, image_meta, [ 738.656501] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 738.656501] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] self._vmops.spawn(context, instance, image_meta, injected_files, [ 738.656501] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 738.656501] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] vm_ref = self.build_virtual_machine(instance, [ 738.657060] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 738.657060] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] vif_infos = vmwarevif.get_vif_info(self._session, [ 738.657060] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 738.657060] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] for vif in network_info: [ 738.657060] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 738.657060] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] return self._sync_wrapper(fn, *args, **kwargs) [ 738.657060] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 738.657060] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] self.wait() [ 738.657060] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 738.657060] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] self[:] = self._gt.wait() [ 738.657060] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 738.657060] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] return self._exit_event.wait() [ 738.657060] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 738.657060] env[59659]: ERROR nova.compute.manager 
[instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] result = hub.switch() [ 738.657679] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 738.657679] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] return self.greenlet.switch() [ 738.657679] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 738.657679] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] result = function(*args, **kwargs) [ 738.657679] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 738.657679] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] return func(*args, **kwargs) [ 738.657679] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 738.657679] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] raise e [ 738.657679] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 738.657679] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] nwinfo = self.network_api.allocate_for_instance( [ 738.657679] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 738.657679] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] created_port_ids = self._update_ports_for_instance( [ 738.657679] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 738.658424] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] with excutils.save_and_reraise_exception(): [ 738.658424] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 738.658424] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] self.force_reraise() [ 738.658424] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 738.658424] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] raise self.value [ 738.658424] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 738.658424] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] updated_port = self._update_port( [ 738.658424] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 738.658424] env[59659]: ERROR nova.compute.manager [instance: 
ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] _ensure_no_port_binding_failure(port) [ 738.658424] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 738.658424] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] raise exception.PortBindingFailed(port_id=port['id']) [ 738.658424] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] nova.exception.PortBindingFailed: Binding failed for port 54f30955-59af-40cf-b87d-43e5cf498d0d, please check neutron logs for more information. [ 738.658424] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] [ 738.659803] env[59659]: INFO nova.compute.manager [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] Terminating instance [ 738.662047] env[59659]: DEBUG oslo_concurrency.lockutils [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] Acquiring lock "refresh_cache-ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 738.662047] env[59659]: DEBUG oslo_concurrency.lockutils [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] Acquired lock "refresh_cache-ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 738.662047] env[59659]: DEBUG nova.network.neutron [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 738.719424] env[59659]: DEBUG nova.network.neutron [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] Instance cache missing network info. 
{{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 739.145855] env[59659]: DEBUG nova.network.neutron [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] Successfully created port: 45008a60-2dcb-40dd-b9e0-f4d0e6137628 {{(pid=59659) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 739.378749] env[59659]: DEBUG nova.network.neutron [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 739.387674] env[59659]: DEBUG oslo_concurrency.lockutils [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] Releasing lock "refresh_cache-ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 739.388091] env[59659]: DEBUG nova.compute.manager [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] Start destroying the instance on the hypervisor. {{(pid=59659) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 739.388275] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] Destroying instance {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 739.388802] env[59659]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-2e9ffc06-75f2-48c7-8eb5-cf3fb566d1db {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 739.398418] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-02ec672d-3b76-4ae3-83c9-08e68f0c2765 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 739.435248] env[59659]: WARNING nova.virt.vmwareapi.vmops [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24 could not be found. [ 739.435695] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] Instance destroyed {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 739.435834] env[59659]: INFO nova.compute.manager [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] Took 0.05 seconds to destroy the instance on the hypervisor. 
[ 739.436148] env[59659]: DEBUG oslo.service.loopingcall [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59659) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 739.436516] env[59659]: DEBUG nova.compute.manager [-] [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 739.436645] env[59659]: DEBUG nova.network.neutron [-] [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 739.524712] env[59659]: DEBUG nova.network.neutron [-] [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 739.534340] env[59659]: DEBUG nova.network.neutron [-] [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 739.550124] env[59659]: INFO nova.compute.manager [-] [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] Took 0.11 seconds to deallocate network for instance. [ 739.556149] env[59659]: DEBUG nova.compute.claims [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] Aborting claim: {{(pid=59659) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 739.556357] env[59659]: DEBUG oslo_concurrency.lockutils [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 739.556674] env[59659]: DEBUG oslo_concurrency.lockutils [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 739.710738] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-564ba7d7-804d-4652-b9d5-63823ea274f5 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 739.718296] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-80c97bb5-4a25-4ea2-8b8b-7cf5d2234992 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 739.750015] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-802a1044-8798-44cc-8a12-5a6eac9f7542 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 739.758509] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-0a1d7b16-ca4d-4d93-a870-503ddd8b56fb {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 739.772406] env[59659]: DEBUG nova.compute.provider_tree [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 739.783718] env[59659]: DEBUG nova.scheduler.client.report [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 739.802147] env[59659]: DEBUG oslo_concurrency.lockutils [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.245s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 739.802803] env[59659]: ERROR nova.compute.manager [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 54f30955-59af-40cf-b87d-43e5cf498d0d, please check neutron logs for more information. 
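The inventory payload reported above for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce translates into usable capacity via the standard Placement convention, (total - reserved) * allocation_ratio, with max_unit capping what any single allocation may request. A small sketch reproducing the numbers from that log entry; the formula is the Placement convention, not code taken from this run:

    # Derive usable capacity from the inventory dict logged above, assuming
    # the Placement formula (total - reserved) * allocation_ratio.
    inventory = {
        'VCPU':      {'total': 48,     'reserved': 0,   'max_unit': 16,
                      'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'max_unit': 65530,
                      'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 400,    'reserved': 0,   'max_unit': 176,
                      'allocation_ratio': 1.0},
    }

    for rc, inv in inventory.items():
        capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        # max_unit limits what one instance may claim in a single allocation.
        print(f"{rc}: capacity={capacity:.0f}, max per allocation={inv['max_unit']}")

    # VCPU: capacity=192, max per allocation=16
    # MEMORY_MB: capacity=196078, max per allocation=65530
    # DISK_GB: capacity=400, max per allocation=176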
[ 739.802803] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] Traceback (most recent call last): [ 739.802803] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 739.802803] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] self.driver.spawn(context, instance, image_meta, [ 739.802803] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 739.802803] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] self._vmops.spawn(context, instance, image_meta, injected_files, [ 739.802803] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 739.802803] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] vm_ref = self.build_virtual_machine(instance, [ 739.802803] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 739.802803] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] vif_infos = vmwarevif.get_vif_info(self._session, [ 739.802803] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 739.803462] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] for vif in network_info: [ 739.803462] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 739.803462] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] return self._sync_wrapper(fn, *args, **kwargs) [ 739.803462] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 739.803462] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] self.wait() [ 739.803462] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 739.803462] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] self[:] = self._gt.wait() [ 739.803462] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 739.803462] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] return self._exit_event.wait() [ 739.803462] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 739.803462] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] result = hub.switch() [ 739.803462] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 739.803462] env[59659]: ERROR 
nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] return self.greenlet.switch() [ 739.803462] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 739.804185] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] result = function(*args, **kwargs) [ 739.804185] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 739.804185] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] return func(*args, **kwargs) [ 739.804185] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 739.804185] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] raise e [ 739.804185] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 739.804185] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] nwinfo = self.network_api.allocate_for_instance( [ 739.804185] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 739.804185] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] created_port_ids = self._update_ports_for_instance( [ 739.804185] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 739.804185] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] with excutils.save_and_reraise_exception(): [ 739.804185] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 739.804185] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] self.force_reraise() [ 739.804636] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 739.804636] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] raise self.value [ 739.804636] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 739.804636] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] updated_port = self._update_port( [ 739.804636] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 739.804636] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] _ensure_no_port_binding_failure(port) [ 739.804636] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 739.804636] env[59659]: ERROR 
nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] raise exception.PortBindingFailed(port_id=port['id']) [ 739.804636] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] nova.exception.PortBindingFailed: Binding failed for port 54f30955-59af-40cf-b87d-43e5cf498d0d, please check neutron logs for more information. [ 739.804636] env[59659]: ERROR nova.compute.manager [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] [ 739.804636] env[59659]: DEBUG nova.compute.utils [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] Binding failed for port 54f30955-59af-40cf-b87d-43e5cf498d0d, please check neutron logs for more information. {{(pid=59659) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 739.805494] env[59659]: DEBUG nova.compute.manager [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] Build of instance ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24 was re-scheduled: Binding failed for port 54f30955-59af-40cf-b87d-43e5cf498d0d, please check neutron logs for more information. {{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 739.805976] env[59659]: DEBUG nova.compute.manager [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] Unplugging VIFs for instance {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 739.806224] env[59659]: DEBUG oslo_concurrency.lockutils [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] Acquiring lock "refresh_cache-ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 739.806565] env[59659]: DEBUG oslo_concurrency.lockutils [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] Acquired lock "refresh_cache-ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 739.806762] env[59659]: DEBUG nova.network.neutron [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 739.866286] env[59659]: DEBUG nova.network.neutron [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] Instance cache missing network info. 
{{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 740.449160] env[59659]: DEBUG nova.network.neutron [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 740.477259] env[59659]: DEBUG oslo_concurrency.lockutils [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Acquiring lock "b8404801-b787-4db2-aa13-320f87ca5ac5" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 740.477259] env[59659]: DEBUG oslo_concurrency.lockutils [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Lock "b8404801-b787-4db2-aa13-320f87ca5ac5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 740.479599] env[59659]: DEBUG oslo_concurrency.lockutils [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] Releasing lock "refresh_cache-ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 740.479599] env[59659]: DEBUG nova.compute.manager [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 740.479599] env[59659]: DEBUG nova.compute.manager [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 740.479599] env[59659]: DEBUG nova.network.neutron [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 740.496589] env[59659]: DEBUG nova.compute.manager [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] Starting instance... 
{{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 740.552073] env[59659]: DEBUG oslo_concurrency.lockutils [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 740.552319] env[59659]: DEBUG oslo_concurrency.lockutils [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 740.554033] env[59659]: INFO nova.compute.claims [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 740.557791] env[59659]: DEBUG nova.network.neutron [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 740.567882] env[59659]: DEBUG oslo_concurrency.lockutils [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] Acquiring lock "e2620c94-5629-4094-a92f-d83d9efd6205" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 740.568108] env[59659]: DEBUG oslo_concurrency.lockutils [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] Lock "e2620c94-5629-4094-a92f-d83d9efd6205" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 740.571682] env[59659]: DEBUG nova.network.neutron [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 740.581474] env[59659]: INFO nova.compute.manager [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] [instance: ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24] Took 0.10 seconds to deallocate network for instance. [ 740.605668] env[59659]: DEBUG nova.compute.manager [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] [instance: e2620c94-5629-4094-a92f-d83d9efd6205] Starting instance... 
{{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 740.684407] env[59659]: DEBUG oslo_concurrency.lockutils [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 740.704280] env[59659]: INFO nova.scheduler.client.report [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] Deleted allocations for instance ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24 [ 740.730115] env[59659]: DEBUG oslo_concurrency.lockutils [None req-00ef82e1-26e6-453a-86ca-1f086efe3204 tempest-ServerMetadataTestJSON-382877116 tempest-ServerMetadataTestJSON-382877116-project-member] Lock "ee7de02d-e1c1-4fe7-8df9-e9f82a39ef24" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 9.897s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 740.753825] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c5b0b47-1aa4-4dd6-b589-0db9018b415f {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 740.762423] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a8feabb-49db-45e8-bbfa-d8f1426aae13 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 740.796741] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae28cce5-db8c-4ac3-acae-735db8c06f11 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 740.805204] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f0ca03aa-ea9b-4dee-813b-cf3b9e45dea2 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 740.821563] env[59659]: DEBUG nova.compute.provider_tree [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 740.831447] env[59659]: DEBUG nova.scheduler.client.report [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 740.847167] env[59659]: DEBUG oslo_concurrency.lockutils [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 
tempest-ServersTestJSON-1247949540-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.295s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 740.847636] env[59659]: DEBUG nova.compute.manager [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] Start building networks asynchronously for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 740.850314] env[59659]: DEBUG oslo_concurrency.lockutils [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.166s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 740.851956] env[59659]: INFO nova.compute.claims [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] [instance: e2620c94-5629-4094-a92f-d83d9efd6205] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 740.880371] env[59659]: DEBUG nova.compute.utils [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Using /dev/sd instead of None {{(pid=59659) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 740.881616] env[59659]: DEBUG nova.compute.manager [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] Allocating IP information in the background. {{(pid=59659) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 740.881790] env[59659]: DEBUG nova.network.neutron [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] allocate_for_instance() {{(pid=59659) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 740.894829] env[59659]: DEBUG nova.compute.manager [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] Start building block device mappings for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 740.971584] env[59659]: DEBUG nova.compute.manager [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] Start spawning the instance on the hypervisor. 
{{(pid=59659) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 740.992923] env[59659]: DEBUG nova.virt.hardware [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-30T19:41:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-30T19:41:37Z,direct_url=,disk_format='vmdk',id=0fa786c9-f55e-46dc-b725-aa456ca9ff53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='3f4ccdf020b5413c9a4233eccb6b55a8',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-30T19:41:37Z,virtual_size=,visibility=), allow threads: False {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 740.993171] env[59659]: DEBUG nova.virt.hardware [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Flavor limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 740.993320] env[59659]: DEBUG nova.virt.hardware [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Image limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 740.993495] env[59659]: DEBUG nova.virt.hardware [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Flavor pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 740.993669] env[59659]: DEBUG nova.virt.hardware [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Image pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 740.993821] env[59659]: DEBUG nova.virt.hardware [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 740.994876] env[59659]: DEBUG nova.virt.hardware [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 740.995221] env[59659]: DEBUG nova.virt.hardware [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 740.995420] env[59659]: DEBUG nova.virt.hardware [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 
tempest-ServersTestJSON-1247949540-project-member] Got 1 possible topologies {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 740.995590] env[59659]: DEBUG nova.virt.hardware [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 740.996142] env[59659]: DEBUG nova.virt.hardware [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 740.997124] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab070cbf-ca4e-4397-bc6c-2120b8353027 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 741.011228] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-144d21a6-0b32-410d-a82d-21709e64cdd5 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 741.028099] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c323264b-92af-4868-b95e-d963bbcf2bfc {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 741.035464] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-55465e8d-f151-42c0-89cc-b18c30b503d8 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 741.068743] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-727129ae-65c0-4858-9157-8fcfcb31e089 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 741.080188] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d766174-a5e9-478a-8920-087b25b0ea5c {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 741.085229] env[59659]: DEBUG nova.policy [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8090dd0e116e4ac89aeb07e25bc22927', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8689e7ba4d544dfcbbdf7c864cb3f823', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59659) authorize /opt/stack/nova/nova/policy.py:203}} [ 741.096900] env[59659]: DEBUG nova.compute.provider_tree [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 741.106319] 
env[59659]: DEBUG nova.scheduler.client.report [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 741.122286] env[59659]: DEBUG oslo_concurrency.lockutils [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.272s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 741.123035] env[59659]: DEBUG nova.compute.manager [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] [instance: e2620c94-5629-4094-a92f-d83d9efd6205] Start building networks asynchronously for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 741.159458] env[59659]: DEBUG nova.compute.utils [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] Using /dev/sd instead of None {{(pid=59659) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 741.160767] env[59659]: DEBUG nova.compute.manager [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] [instance: e2620c94-5629-4094-a92f-d83d9efd6205] Allocating IP information in the background. {{(pid=59659) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 741.160939] env[59659]: DEBUG nova.network.neutron [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] [instance: e2620c94-5629-4094-a92f-d83d9efd6205] allocate_for_instance() {{(pid=59659) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 741.172823] env[59659]: DEBUG nova.compute.manager [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] [instance: e2620c94-5629-4094-a92f-d83d9efd6205] Start building block device mappings for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 741.261134] env[59659]: DEBUG nova.compute.manager [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] [instance: e2620c94-5629-4094-a92f-d83d9efd6205] Start spawning the instance on the hypervisor. 
{{(pid=59659) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 741.286345] env[59659]: DEBUG nova.virt.hardware [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-30T19:41:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-30T19:41:37Z,direct_url=,disk_format='vmdk',id=0fa786c9-f55e-46dc-b725-aa456ca9ff53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='3f4ccdf020b5413c9a4233eccb6b55a8',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-30T19:41:37Z,virtual_size=,visibility=), allow threads: False {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 741.286580] env[59659]: DEBUG nova.virt.hardware [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] Flavor limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 741.286728] env[59659]: DEBUG nova.virt.hardware [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] Image limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 741.286903] env[59659]: DEBUG nova.virt.hardware [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] Flavor pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 741.287053] env[59659]: DEBUG nova.virt.hardware [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] Image pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 741.287198] env[59659]: DEBUG nova.virt.hardware [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 741.287398] env[59659]: DEBUG nova.virt.hardware [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 741.287550] env[59659]: DEBUG nova.virt.hardware [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 
741.287709] env[59659]: DEBUG nova.virt.hardware [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] Got 1 possible topologies {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 741.287870] env[59659]: DEBUG nova.virt.hardware [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 741.288036] env[59659]: DEBUG nova.virt.hardware [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 741.289173] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be1da7c2-4d26-474f-89a6-951c0985fc78 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 741.298444] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-77b7aac7-928e-4b0b-af9f-dfbadf1c6318 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 741.626224] env[59659]: DEBUG nova.policy [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd13e7b6d63fc4aa4aaf5c066c9cbfcbf', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '18959b240e3b49d6a94dc7aa1e92487c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59659) authorize /opt/stack/nova/nova/policy.py:203}} [ 741.885857] env[59659]: ERROR nova.compute.manager [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 48ed80ba-0203-40d7-adfe-9f69bcfe2e45, please check neutron logs for more information. 
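Earlier in this run, nova.virt.hardware reduced the m1.nano flavor (1 vCPU, flavor and image limits all 0:0:0, maxima 65536:65536:65536) to the single topology cores=1, sockets=1, threads=1. The simplified sketch below is an illustration of that enumeration rather than Nova's implementation; it shows why factoring one vCPU into sockets*cores*threads under those maxima yields exactly one result:

    # Simplified illustration of the topology enumeration logged above
    # ("Build topologies for 1 vcpu(s) ... Possible topologies [...]").
    from typing import List, Tuple

    def possible_topologies(vcpus: int,
                            max_sockets: int = 65536,
                            max_cores: int = 65536,
                            max_threads: int = 65536) -> List[Tuple[int, int, int]]:
        """Return all (sockets, cores, threads) with s*c*t == vcpus within limits."""
        topologies = []
        for sockets in range(1, min(vcpus, max_sockets) + 1):
            if vcpus % sockets:
                continue
            for cores in range(1, min(vcpus // sockets, max_cores) + 1):
                if (vcpus // sockets) % cores:
                    continue
                threads = vcpus // (sockets * cores)
                if threads <= max_threads:
                    topologies.append((sockets, cores, threads))
        return topologies

    print(possible_topologies(1))   # [(1, 1, 1)] -- matches the log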
[ 741.885857] env[59659]: ERROR nova.compute.manager Traceback (most recent call last): [ 741.885857] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 741.885857] env[59659]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 741.885857] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 741.885857] env[59659]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 741.885857] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 741.885857] env[59659]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 741.885857] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 741.885857] env[59659]: ERROR nova.compute.manager self.force_reraise() [ 741.885857] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 741.885857] env[59659]: ERROR nova.compute.manager raise self.value [ 741.885857] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 741.885857] env[59659]: ERROR nova.compute.manager updated_port = self._update_port( [ 741.885857] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 741.885857] env[59659]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 741.886482] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 741.886482] env[59659]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 741.886482] env[59659]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 48ed80ba-0203-40d7-adfe-9f69bcfe2e45, please check neutron logs for more information. 
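The tracebacks above and below terminate in nova.network.neutron._ensure_no_port_binding_failure (neutron.py:294), which raises PortBindingFailed when Neutron reports a failed binding on the port. A minimal sketch of that check; the exception class, the dict literal, and the 'binding_failed' string stand in for Nova's and Neutron's constants:

    # Sketch of the check where the tracebacks terminate; the literals below
    # are stand-ins for the nova.exception and nova.network.model constants.
    class PortBindingFailed(Exception):
        def __init__(self, port_id):
            super().__init__(
                "Binding failed for port %s, please check neutron logs "
                "for more information." % port_id)

    def _ensure_no_port_binding_failure(port):
        # Neutron marks a port whose binding could not be completed with
        # binding:vif_type = 'binding_failed'; Nova turns that into an error.
        if port.get('binding:vif_type') == 'binding_failed':
            raise PortBindingFailed(port_id=port['id'])

    _ensure_no_port_binding_failure(
        {'id': '48ed80ba-0203-40d7-adfe-9f69bcfe2e45',
         'binding:vif_type': 'binding_failed'})   # raises PortBindingFailed

In this run the binding for port 48ed80ba-0203-40d7-adfe-9f69bcfe2e45 came back failed, so the exception propagated out through save_and_reraise_exception, the spawn was aborted, and the instance was terminated and cleaned up as shown in the records that follow.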
[ 741.886482] env[59659]: ERROR nova.compute.manager [ 741.886482] env[59659]: Traceback (most recent call last): [ 741.886482] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 741.886482] env[59659]: listener.cb(fileno) [ 741.886482] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 741.886482] env[59659]: result = function(*args, **kwargs) [ 741.886482] env[59659]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 741.886482] env[59659]: return func(*args, **kwargs) [ 741.886482] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 741.886482] env[59659]: raise e [ 741.886482] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 741.886482] env[59659]: nwinfo = self.network_api.allocate_for_instance( [ 741.886482] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 741.886482] env[59659]: created_port_ids = self._update_ports_for_instance( [ 741.886482] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 741.886482] env[59659]: with excutils.save_and_reraise_exception(): [ 741.886482] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 741.886482] env[59659]: self.force_reraise() [ 741.886482] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 741.886482] env[59659]: raise self.value [ 741.886482] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 741.886482] env[59659]: updated_port = self._update_port( [ 741.886482] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 741.886482] env[59659]: _ensure_no_port_binding_failure(port) [ 741.886482] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 741.886482] env[59659]: raise exception.PortBindingFailed(port_id=port['id']) [ 741.887576] env[59659]: nova.exception.PortBindingFailed: Binding failed for port 48ed80ba-0203-40d7-adfe-9f69bcfe2e45, please check neutron logs for more information. [ 741.887576] env[59659]: Removing descriptor: 21 [ 741.887576] env[59659]: ERROR nova.compute.manager [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 48ed80ba-0203-40d7-adfe-9f69bcfe2e45, please check neutron logs for more information. 
[ 741.887576] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] Traceback (most recent call last): [ 741.887576] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 741.887576] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] yield resources [ 741.887576] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 741.887576] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] self.driver.spawn(context, instance, image_meta, [ 741.887576] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 741.887576] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 741.887576] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 741.887576] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] vm_ref = self.build_virtual_machine(instance, [ 741.887888] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 741.887888] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] vif_infos = vmwarevif.get_vif_info(self._session, [ 741.887888] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 741.887888] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] for vif in network_info: [ 741.887888] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 741.887888] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] return self._sync_wrapper(fn, *args, **kwargs) [ 741.887888] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 741.887888] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] self.wait() [ 741.887888] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 741.887888] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] self[:] = self._gt.wait() [ 741.887888] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 741.887888] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] return self._exit_event.wait() [ 741.887888] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 741.887888] env[59659]: ERROR nova.compute.manager 
[instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] result = hub.switch() [ 741.888303] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 741.888303] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] return self.greenlet.switch() [ 741.888303] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 741.888303] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] result = function(*args, **kwargs) [ 741.888303] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 741.888303] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] return func(*args, **kwargs) [ 741.888303] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 741.888303] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] raise e [ 741.888303] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 741.888303] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] nwinfo = self.network_api.allocate_for_instance( [ 741.888303] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 741.888303] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] created_port_ids = self._update_ports_for_instance( [ 741.888303] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 741.888651] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] with excutils.save_and_reraise_exception(): [ 741.888651] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 741.888651] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] self.force_reraise() [ 741.888651] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 741.888651] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] raise self.value [ 741.888651] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 741.888651] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] updated_port = self._update_port( [ 741.888651] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 741.888651] env[59659]: ERROR nova.compute.manager [instance: 
e90ee443-efe0-4f3e-999b-b9376e41fcb5] _ensure_no_port_binding_failure(port) [ 741.888651] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 741.888651] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] raise exception.PortBindingFailed(port_id=port['id']) [ 741.888651] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] nova.exception.PortBindingFailed: Binding failed for port 48ed80ba-0203-40d7-adfe-9f69bcfe2e45, please check neutron logs for more information. [ 741.888651] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] [ 741.889019] env[59659]: INFO nova.compute.manager [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] Terminating instance [ 741.894067] env[59659]: DEBUG oslo_concurrency.lockutils [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] Acquiring lock "refresh_cache-e90ee443-efe0-4f3e-999b-b9376e41fcb5" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 741.894268] env[59659]: DEBUG oslo_concurrency.lockutils [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] Acquired lock "refresh_cache-e90ee443-efe0-4f3e-999b-b9376e41fcb5" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 741.894438] env[59659]: DEBUG nova.network.neutron [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 741.953597] env[59659]: DEBUG nova.network.neutron [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] Instance cache missing network info. 
{{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 742.296103] env[59659]: DEBUG oslo_concurrency.lockutils [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] Acquiring lock "b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 742.296486] env[59659]: DEBUG oslo_concurrency.lockutils [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] Lock "b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 742.311763] env[59659]: DEBUG nova.compute.manager [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] Starting instance... {{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 742.322492] env[59659]: DEBUG oslo_concurrency.lockutils [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Acquiring lock "f05a805d-7896-477c-b2ea-437faec88fba" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 742.322872] env[59659]: DEBUG oslo_concurrency.lockutils [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Lock "f05a805d-7896-477c-b2ea-437faec88fba" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 742.342020] env[59659]: DEBUG nova.compute.manager [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: f05a805d-7896-477c-b2ea-437faec88fba] Starting instance... 
{{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 742.391447] env[59659]: DEBUG oslo_concurrency.lockutils [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 742.391699] env[59659]: DEBUG oslo_concurrency.lockutils [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 742.393274] env[59659]: INFO nova.compute.claims [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 742.415652] env[59659]: DEBUG oslo_concurrency.lockutils [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 742.562064] env[59659]: DEBUG nova.network.neutron [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 742.574625] env[59659]: DEBUG oslo_concurrency.lockutils [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] Releasing lock "refresh_cache-e90ee443-efe0-4f3e-999b-b9376e41fcb5" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 742.574954] env[59659]: DEBUG nova.compute.manager [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] Start destroying the instance on the hypervisor. 
{{(pid=59659) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 742.577193] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] Destroying instance {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 742.577193] env[59659]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-1b36ad7d-2a95-46f0-9ce2-55e7823c9a75 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 742.588235] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c2c53b61-86c0-41f7-b426-35e7d879b0e3 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 742.620891] env[59659]: WARNING nova.virt.vmwareapi.vmops [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance e90ee443-efe0-4f3e-999b-b9376e41fcb5 could not be found. [ 742.623067] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] Instance destroyed {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 742.623067] env[59659]: INFO nova.compute.manager [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] Took 0.05 seconds to destroy the instance on the hypervisor. [ 742.623067] env[59659]: DEBUG oslo.service.loopingcall [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=59659) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 742.624554] env[59659]: DEBUG nova.compute.manager [-] [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 742.625775] env[59659]: DEBUG nova.network.neutron [-] [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 742.628212] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ec0181a-b359-45cf-bb72-f58ab0e33984 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 742.636546] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab93ab9b-6695-4488-b4c3-5f31ad1d2165 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 742.668522] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ae54aac-6f13-4424-906f-03399e872fe4 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 742.676601] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18f95305-4594-411d-8458-183c65f321f3 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 742.694388] env[59659]: DEBUG nova.compute.provider_tree [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 742.706777] env[59659]: DEBUG nova.scheduler.client.report [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 742.719406] env[59659]: DEBUG nova.network.neutron [-] [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] Instance cache missing network info. 
{{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 742.726123] env[59659]: DEBUG oslo_concurrency.lockutils [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.334s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 742.726542] env[59659]: DEBUG nova.compute.manager [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] Start building networks asynchronously for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 742.730618] env[59659]: DEBUG oslo_concurrency.lockutils [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Acquiring lock "78ed17da-e8e8-4872-b1bf-95c4e77de8e6" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 742.731010] env[59659]: DEBUG oslo_concurrency.lockutils [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Lock "78ed17da-e8e8-4872-b1bf-95c4e77de8e6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 742.732335] env[59659]: DEBUG oslo_concurrency.lockutils [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.317s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 742.735778] env[59659]: INFO nova.compute.claims [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: f05a805d-7896-477c-b2ea-437faec88fba] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 742.736570] env[59659]: DEBUG nova.network.neutron [-] [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 742.743736] env[59659]: DEBUG nova.compute.manager [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] Starting instance... 
{{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 742.773808] env[59659]: DEBUG oslo_concurrency.lockutils [None req-b9327b63-17c3-4cf9-a544-0081ceef4d95 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Acquiring lock "b8404801-b787-4db2-aa13-320f87ca5ac5" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 742.780370] env[59659]: DEBUG nova.compute.utils [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] Using /dev/sd instead of None {{(pid=59659) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 742.783260] env[59659]: DEBUG nova.compute.manager [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] Allocating IP information in the background. {{(pid=59659) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 742.783425] env[59659]: DEBUG nova.network.neutron [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] allocate_for_instance() {{(pid=59659) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 742.785489] env[59659]: INFO nova.compute.manager [-] [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] Took 0.16 seconds to deallocate network for instance. [ 742.788816] env[59659]: DEBUG nova.compute.claims [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] Aborting claim: {{(pid=59659) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 742.788990] env[59659]: DEBUG oslo_concurrency.lockutils [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 742.806344] env[59659]: DEBUG nova.compute.manager [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] Start building block device mappings for instance. 
{{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 742.825594] env[59659]: DEBUG oslo_concurrency.lockutils [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 742.894537] env[59659]: DEBUG nova.compute.manager [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] Start spawning the instance on the hypervisor. {{(pid=59659) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 742.945050] env[59659]: DEBUG nova.virt.hardware [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-30T19:41:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-30T19:41:37Z,direct_url=,disk_format='vmdk',id=0fa786c9-f55e-46dc-b725-aa456ca9ff53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='3f4ccdf020b5413c9a4233eccb6b55a8',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-30T19:41:37Z,virtual_size=,visibility=), allow threads: False {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 742.945050] env[59659]: DEBUG nova.virt.hardware [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] Flavor limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 742.945050] env[59659]: DEBUG nova.virt.hardware [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] Image limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 742.947294] env[59659]: DEBUG nova.virt.hardware [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] Flavor pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 742.947294] env[59659]: DEBUG nova.virt.hardware [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] Image pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 742.947294] env[59659]: DEBUG nova.virt.hardware [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59659) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 742.947294] env[59659]: DEBUG nova.virt.hardware [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 742.947294] env[59659]: DEBUG nova.virt.hardware [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 742.947786] env[59659]: DEBUG nova.virt.hardware [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] Got 1 possible topologies {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 742.947786] env[59659]: DEBUG nova.virt.hardware [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 742.947786] env[59659]: DEBUG nova.virt.hardware [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 742.950727] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-58522582-fc35-446d-958d-df2aef37f411 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 742.963249] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a01dde21-bef8-4f39-b10c-4cc452d03f64 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 742.971017] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e2aed0b-f383-4f2f-9484-0abe4b6620f9 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 742.985863] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a97a552-cbd4-4a6a-9627-01d326aafb43 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 743.017106] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c76246d-11a5-4522-8eed-d02782638ed9 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 743.025255] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d203f772-1c02-4fe2-9bd5-270df09f3546 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 743.039817] env[59659]: DEBUG nova.compute.provider_tree [None 
req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 743.052620] env[59659]: DEBUG nova.scheduler.client.report [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 743.079961] env[59659]: DEBUG oslo_concurrency.lockutils [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.347s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 743.080517] env[59659]: DEBUG nova.compute.manager [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: f05a805d-7896-477c-b2ea-437faec88fba] Start building networks asynchronously for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 743.086279] env[59659]: DEBUG nova.network.neutron [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] Successfully created port: b6e44e9f-1e71-4450-80a0-4d9203c470e9 {{(pid=59659) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 743.089520] env[59659]: DEBUG nova.policy [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4432ca36538f4b16b4cf842c7e286271', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7a9e45c3a1d148e7b494e0291dae92bf', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59659) authorize /opt/stack/nova/nova/policy.py:203}} [ 743.090867] env[59659]: DEBUG oslo_concurrency.lockutils [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.302s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 743.145496] env[59659]: DEBUG nova.compute.utils [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Using /dev/sd 
instead of None {{(pid=59659) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 743.149263] env[59659]: DEBUG nova.compute.manager [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: f05a805d-7896-477c-b2ea-437faec88fba] Allocating IP information in the background. {{(pid=59659) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 743.149441] env[59659]: DEBUG nova.network.neutron [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: f05a805d-7896-477c-b2ea-437faec88fba] allocate_for_instance() {{(pid=59659) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 743.171721] env[59659]: DEBUG nova.compute.manager [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: f05a805d-7896-477c-b2ea-437faec88fba] Start building block device mappings for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 743.269432] env[59659]: DEBUG nova.compute.manager [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: f05a805d-7896-477c-b2ea-437faec88fba] Start spawning the instance on the hypervisor. {{(pid=59659) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 743.302727] env[59659]: DEBUG nova.virt.hardware [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-30T19:41:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-30T19:41:37Z,direct_url=,disk_format='vmdk',id=0fa786c9-f55e-46dc-b725-aa456ca9ff53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='3f4ccdf020b5413c9a4233eccb6b55a8',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-30T19:41:37Z,virtual_size=,visibility=), allow threads: False {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 743.302988] env[59659]: DEBUG nova.virt.hardware [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Flavor limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 743.303160] env[59659]: DEBUG nova.virt.hardware [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Image limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 743.303333] env[59659]: DEBUG nova.virt.hardware [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Flavor pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 743.303467] env[59659]: DEBUG nova.virt.hardware [None 
req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Image pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 743.303606] env[59659]: DEBUG nova.virt.hardware [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 743.303831] env[59659]: DEBUG nova.virt.hardware [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 743.303979] env[59659]: DEBUG nova.virt.hardware [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 743.304148] env[59659]: DEBUG nova.virt.hardware [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Got 1 possible topologies {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 743.304300] env[59659]: DEBUG nova.virt.hardware [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 743.304459] env[59659]: DEBUG nova.virt.hardware [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 743.305362] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-86379fe5-5028-444c-a901-4f8a3b2846b2 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 743.314703] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-43ef925d-2ea4-47b0-920d-045506fd2ad2 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 743.338794] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2787299c-f03d-491d-b209-f0defc1e9f6a {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 743.346382] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4a45fbd-ab6b-452f-9202-56dfc6e25879 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 743.379608] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb3be2d0-feae-49c8-9b41-3a0bc9739e96 {{(pid=59659) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 743.386544] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f764c82d-44e1-4039-9958-5cecbe3d12bd {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 743.400146] env[59659]: DEBUG nova.compute.provider_tree [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 743.409783] env[59659]: DEBUG nova.scheduler.client.report [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 743.438493] env[59659]: DEBUG nova.network.neutron [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] [instance: e2620c94-5629-4094-a92f-d83d9efd6205] Successfully created port: b97ea0a9-00ea-409a-ba57-0c68616a1f0a {{(pid=59659) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 743.449664] env[59659]: DEBUG oslo_concurrency.lockutils [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.359s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 743.450313] env[59659]: ERROR nova.compute.manager [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 48ed80ba-0203-40d7-adfe-9f69bcfe2e45, please check neutron logs for more information. 
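The build failure recorded just above (and unwound in the traceback that follows) comes from Nova refusing to use a port whose Neutron binding did not succeed. Below is a minimal standalone sketch of that check, under the simplified assumption that Neutron flags an unbindable port by returning the sentinel value "binding_failed" in binding:vif_type; this is an illustration of the pattern, not the actual nova/network/neutron.py code.

class PortBindingFailed(Exception):
    """Mirrors the message format seen in the log entries above."""
    def __init__(self, port_id):
        super().__init__(
            "Binding failed for port %s, please check neutron logs for more "
            "information." % port_id)

def ensure_no_port_binding_failure(port):
    # Assumed simplified check: a failed binding shows up as the sentinel
    # vif_type "binding_failed" on the port dict returned by Neutron.
    if port.get("binding:vif_type") == "binding_failed":
        raise PortBindingFailed(port["id"])

# The port from the traceback below would trip the check like this:
try:
    ensure_no_port_binding_failure({
        "id": "48ed80ba-0203-40d7-adfe-9f69bcfe2e45",
        "binding:vif_type": "binding_failed",
    })
except PortBindingFailed as exc:
    print(exc)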
[ 743.450313] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] Traceback (most recent call last): [ 743.450313] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 743.450313] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] self.driver.spawn(context, instance, image_meta, [ 743.450313] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 743.450313] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 743.450313] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 743.450313] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] vm_ref = self.build_virtual_machine(instance, [ 743.450313] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 743.450313] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] vif_infos = vmwarevif.get_vif_info(self._session, [ 743.450313] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 743.451401] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] for vif in network_info: [ 743.451401] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 743.451401] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] return self._sync_wrapper(fn, *args, **kwargs) [ 743.451401] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 743.451401] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] self.wait() [ 743.451401] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 743.451401] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] self[:] = self._gt.wait() [ 743.451401] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 743.451401] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] return self._exit_event.wait() [ 743.451401] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 743.451401] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] result = hub.switch() [ 743.451401] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 743.451401] env[59659]: ERROR 
nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] return self.greenlet.switch() [ 743.451401] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 743.451790] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] result = function(*args, **kwargs) [ 743.451790] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 743.451790] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] return func(*args, **kwargs) [ 743.451790] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 743.451790] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] raise e [ 743.451790] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 743.451790] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] nwinfo = self.network_api.allocate_for_instance( [ 743.451790] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 743.451790] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] created_port_ids = self._update_ports_for_instance( [ 743.451790] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 743.451790] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] with excutils.save_and_reraise_exception(): [ 743.451790] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 743.451790] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] self.force_reraise() [ 743.452230] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 743.452230] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] raise self.value [ 743.452230] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 743.452230] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] updated_port = self._update_port( [ 743.452230] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 743.452230] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] _ensure_no_port_binding_failure(port) [ 743.452230] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 743.452230] env[59659]: ERROR 
nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] raise exception.PortBindingFailed(port_id=port['id']) [ 743.452230] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] nova.exception.PortBindingFailed: Binding failed for port 48ed80ba-0203-40d7-adfe-9f69bcfe2e45, please check neutron logs for more information. [ 743.452230] env[59659]: ERROR nova.compute.manager [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] [ 743.452230] env[59659]: DEBUG nova.compute.utils [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] Binding failed for port 48ed80ba-0203-40d7-adfe-9f69bcfe2e45, please check neutron logs for more information. {{(pid=59659) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 743.454798] env[59659]: DEBUG oslo_concurrency.lockutils [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.627s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 743.454798] env[59659]: INFO nova.compute.claims [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 743.460463] env[59659]: DEBUG nova.compute.manager [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] Build of instance e90ee443-efe0-4f3e-999b-b9376e41fcb5 was re-scheduled: Binding failed for port 48ed80ba-0203-40d7-adfe-9f69bcfe2e45, please check neutron logs for more information. 
{{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 743.462854] env[59659]: DEBUG nova.compute.manager [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] Unplugging VIFs for instance {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 743.462854] env[59659]: DEBUG oslo_concurrency.lockutils [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] Acquiring lock "refresh_cache-e90ee443-efe0-4f3e-999b-b9376e41fcb5" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 743.462854] env[59659]: DEBUG oslo_concurrency.lockutils [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] Acquired lock "refresh_cache-e90ee443-efe0-4f3e-999b-b9376e41fcb5" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 743.462854] env[59659]: DEBUG nova.network.neutron [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 743.653445] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a839183e-47af-45cf-893e-cc490bf90e5e {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 743.658998] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f6a9b9a-49f0-4b78-adf1-34d0deb93383 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 743.696998] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf85aaff-cfb4-4546-a615-3c60bd52b50e {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 743.703555] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a7c3f1e-c576-4b6f-82eb-16a14247c236 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 743.718601] env[59659]: DEBUG nova.compute.provider_tree [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 743.728620] env[59659]: DEBUG nova.scheduler.client.report [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 
1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 743.743482] env[59659]: DEBUG oslo_concurrency.lockutils [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.291s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 743.746524] env[59659]: DEBUG nova.compute.manager [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] Start building networks asynchronously for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 743.754573] env[59659]: DEBUG nova.policy [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7c2b5f71e3034e5f90220c5ebf1bb6d9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'dd2add382fb34e309cc9b0acd9403ef6', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59659) authorize /opt/stack/nova/nova/policy.py:203}} [ 743.763687] env[59659]: DEBUG nova.network.neutron [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 743.782448] env[59659]: DEBUG nova.compute.utils [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Using /dev/sd instead of None {{(pid=59659) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 743.787260] env[59659]: DEBUG nova.compute.manager [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] Not allocating networking since 'none' was specified. {{(pid=59659) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1948}} [ 743.792806] env[59659]: DEBUG nova.compute.manager [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] Start building block device mappings for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 743.862687] env[59659]: DEBUG nova.compute.manager [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] Start spawning the instance on the hypervisor. 
{{(pid=59659) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 743.887690] env[59659]: DEBUG nova.virt.hardware [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-30T19:41:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-30T19:41:37Z,direct_url=,disk_format='vmdk',id=0fa786c9-f55e-46dc-b725-aa456ca9ff53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='3f4ccdf020b5413c9a4233eccb6b55a8',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-30T19:41:37Z,virtual_size=,visibility=), allow threads: False {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 743.888071] env[59659]: DEBUG nova.virt.hardware [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Flavor limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 743.888165] env[59659]: DEBUG nova.virt.hardware [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Image limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 743.888334] env[59659]: DEBUG nova.virt.hardware [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Flavor pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 743.888476] env[59659]: DEBUG nova.virt.hardware [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Image pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 743.888617] env[59659]: DEBUG nova.virt.hardware [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 743.888817] env[59659]: DEBUG nova.virt.hardware [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 743.888987] env[59659]: DEBUG nova.virt.hardware [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 743.889336] env[59659]: DEBUG nova.virt.hardware [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c 
tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Got 1 possible topologies {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 743.889542] env[59659]: DEBUG nova.virt.hardware [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 743.889748] env[59659]: DEBUG nova.virt.hardware [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 743.890705] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d9518050-aeed-4925-813c-6cff5f0ae710 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 743.899096] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b0b93c9-3e8d-43bd-84eb-bd6dcfdb4a4a {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 743.913251] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] Instance VIF info [] {{(pid=59659) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 743.919167] env[59659]: DEBUG nova.virt.vmwareapi.vm_util [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Creating folder: Project (352b723cfbb34bfa9e4500f104c508f1). Parent ref: group-v293946. {{(pid=59659) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 743.919471] env[59659]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-6f263af9-ab84-4974-86a4-9ca52f77ff32 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 743.928988] env[59659]: INFO nova.virt.vmwareapi.vm_util [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Created folder: Project (352b723cfbb34bfa9e4500f104c508f1) in parent group-v293946. [ 743.929192] env[59659]: DEBUG nova.virt.vmwareapi.vm_util [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Creating folder: Instances. Parent ref: group-v293957. {{(pid=59659) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 743.929416] env[59659]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-dc412317-bda2-44eb-b7dd-f61f99ec1a98 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 743.938186] env[59659]: INFO nova.virt.vmwareapi.vm_util [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Created folder: Instances in parent group-v293957. 
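The Folder.CreateVM_Task invocation that follows is asynchronous on the vCenter side: the caller then waits for the returned task and polls its progress until it completes, which is what the "Waiting for the task" and "progress is 0%" entries below show. Here is a simplified standalone sketch of that wait loop, with a hypothetical poll_fn standing in for the real property lookup oslo.vmware performs; it is illustrative only, not the oslo_vmware.api implementation.

import time

class TaskFailed(Exception):
    pass

def wait_for_task(poll_fn, interval=0.5, timeout=300.0):
    """Poll a task until it reports success, error, or the timeout expires.

    poll_fn is a hypothetical callable returning (state, progress), where
    state is one of "queued", "running", "success", "error".
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        state, progress = poll_fn()
        print("progress is %d%%" % progress)   # mirrors the DEBUG polling lines
        if state == "success":
            return
        if state == "error":
            raise TaskFailed("task finished with an error")
        time.sleep(interval)                   # back off between polls
    raise TaskFailed("timed out waiting for task")

# Example: a fake task that finishes on the second poll.
_states = iter([("running", 0), ("success", 100)])
wait_for_task(lambda: next(_states), interval=0.0)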
[ 743.938420] env[59659]: DEBUG oslo.service.loopingcall [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59659) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 743.938606] env[59659]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] Creating VM on the ESX host {{(pid=59659) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 743.938797] env[59659]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-1380e432-4c2c-41ec-9454-9a4325c2574d {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 743.956845] env[59659]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 743.956845] env[59659]: value = "task-1384547" [ 743.956845] env[59659]: _type = "Task" [ 743.956845] env[59659]: } to complete. {{(pid=59659) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 743.970796] env[59659]: DEBUG oslo_vmware.api [-] Task: {'id': task-1384547, 'name': CreateVM_Task} progress is 0%. {{(pid=59659) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 744.422553] env[59659]: DEBUG nova.network.neutron [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 744.435434] env[59659]: DEBUG oslo_concurrency.lockutils [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] Releasing lock "refresh_cache-e90ee443-efe0-4f3e-999b-b9376e41fcb5" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 744.438983] env[59659]: DEBUG nova.compute.manager [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 744.438983] env[59659]: DEBUG nova.compute.manager [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 744.438983] env[59659]: DEBUG nova.network.neutron [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 744.468664] env[59659]: DEBUG oslo_vmware.api [-] Task: {'id': task-1384547, 'name': CreateVM_Task, 'duration_secs': 0.246087} completed successfully. 
{{(pid=59659) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 744.468888] env[59659]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] Created VM on the ESX host {{(pid=59659) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 744.470033] env[59659]: DEBUG oslo_vmware.service [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8bed56e1-6d0b-4a71-a6fa-d2c5b7e86496 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 744.481292] env[59659]: DEBUG oslo_concurrency.lockutils [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/0fa786c9-f55e-46dc-b725-aa456ca9ff53" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 744.481511] env[59659]: DEBUG oslo_concurrency.lockutils [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Acquired lock "[datastore1] devstack-image-cache_base/0fa786c9-f55e-46dc-b725-aa456ca9ff53" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 744.482016] env[59659]: DEBUG oslo_concurrency.lockutils [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/0fa786c9-f55e-46dc-b725-aa456ca9ff53" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 744.482595] env[59659]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8e95c009-d065-4184-9076-ab0773e4edca {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 744.487461] env[59659]: DEBUG oslo_vmware.api [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Waiting for the task: (returnval){ [ 744.487461] env[59659]: value = "session[522b4e95-5a6c-8a31-9bee-9871c53e7a49]52e22aa3-0d70-ded2-f972-6bdcb5f592c0" [ 744.487461] env[59659]: _type = "Task" [ 744.487461] env[59659]: } to complete. {{(pid=59659) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 744.498039] env[59659]: DEBUG oslo_vmware.api [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Task: {'id': session[522b4e95-5a6c-8a31-9bee-9871c53e7a49]52e22aa3-0d70-ded2-f972-6bdcb5f592c0, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59659) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 744.528988] env[59659]: DEBUG nova.network.neutron [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] Instance cache missing network info. 
{{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 744.536849] env[59659]: DEBUG nova.network.neutron [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 744.552056] env[59659]: INFO nova.compute.manager [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] [instance: e90ee443-efe0-4f3e-999b-b9376e41fcb5] Took 0.11 seconds to deallocate network for instance. [ 744.667395] env[59659]: INFO nova.scheduler.client.report [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] Deleted allocations for instance e90ee443-efe0-4f3e-999b-b9376e41fcb5 [ 744.685067] env[59659]: DEBUG oslo_concurrency.lockutils [None req-a507a8ca-f2f0-480c-9b31-211a8ebd4d11 tempest-ServersTestJSON-1603309010 tempest-ServersTestJSON-1603309010-project-member] Lock "e90ee443-efe0-4f3e-999b-b9376e41fcb5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 12.581s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 745.002937] env[59659]: DEBUG oslo_concurrency.lockutils [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Releasing lock "[datastore1] devstack-image-cache_base/0fa786c9-f55e-46dc-b725-aa456ca9ff53" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 745.003937] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] Processing image 0fa786c9-f55e-46dc-b725-aa456ca9ff53 {{(pid=59659) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 745.006166] env[59659]: DEBUG oslo_concurrency.lockutils [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/0fa786c9-f55e-46dc-b725-aa456ca9ff53/0fa786c9-f55e-46dc-b725-aa456ca9ff53.vmdk" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 745.006166] env[59659]: DEBUG oslo_concurrency.lockutils [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Acquired lock "[datastore1] devstack-image-cache_base/0fa786c9-f55e-46dc-b725-aa456ca9ff53/0fa786c9-f55e-46dc-b725-aa456ca9ff53.vmdk" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 745.006166] env[59659]: DEBUG nova.virt.vmwareapi.ds_util [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59659) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 745.006166] env[59659]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with 
opID=oslo.vmware-b3e30ed1-9133-4a5d-a940-1e3894dafe43 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 745.017612] env[59659]: DEBUG nova.virt.vmwareapi.ds_util [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59659) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 745.020658] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59659) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 745.021704] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38c3a671-1032-4084-b1b9-2c8b97ca02c2 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 745.030633] env[59659]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5b4572f7-e5fe-4ec8-83b3-fd4d03800f1a {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 745.037244] env[59659]: DEBUG oslo_vmware.api [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Waiting for the task: (returnval){ [ 745.037244] env[59659]: value = "session[522b4e95-5a6c-8a31-9bee-9871c53e7a49]52f9e700-fcb1-100b-58e3-6af919d87782" [ 745.037244] env[59659]: _type = "Task" [ 745.037244] env[59659]: } to complete. {{(pid=59659) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 745.044838] env[59659]: DEBUG oslo_vmware.api [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Task: {'id': session[522b4e95-5a6c-8a31-9bee-9871c53e7a49]52f9e700-fcb1-100b-58e3-6af919d87782, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59659) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 745.436572] env[59659]: DEBUG nova.network.neutron [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] Successfully created port: 54e48d56-6f7b-4965-aed4-4439616ed0bf {{(pid=59659) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 745.548537] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] Preparing fetch location {{(pid=59659) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 745.548790] env[59659]: DEBUG nova.virt.vmwareapi.ds_util [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Creating directory with path [datastore1] vmware_temp/4d9e4f65-b9e9-4426-ac23-2a2e2760d555/0fa786c9-f55e-46dc-b725-aa456ca9ff53 {{(pid=59659) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 745.549047] env[59659]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-305c3ac1-efd6-4ed0-af1c-a97af2cec987 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 745.575063] env[59659]: DEBUG nova.virt.vmwareapi.ds_util [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Created directory with path [datastore1] vmware_temp/4d9e4f65-b9e9-4426-ac23-2a2e2760d555/0fa786c9-f55e-46dc-b725-aa456ca9ff53 {{(pid=59659) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 745.575277] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] Fetch image to [datastore1] vmware_temp/4d9e4f65-b9e9-4426-ac23-2a2e2760d555/0fa786c9-f55e-46dc-b725-aa456ca9ff53/tmp-sparse.vmdk {{(pid=59659) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 745.575443] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] Downloading image file data 0fa786c9-f55e-46dc-b725-aa456ca9ff53 to [datastore1] vmware_temp/4d9e4f65-b9e9-4426-ac23-2a2e2760d555/0fa786c9-f55e-46dc-b725-aa456ca9ff53/tmp-sparse.vmdk on the data store datastore1 {{(pid=59659) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 745.576267] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e27853c7-7174-463f-8a96-dc7b8af607ef {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 745.583489] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9368b0b6-49d4-4309-807d-55985adfce7e {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 745.593493] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-66f839f4-33e3-4868-99a3-739f8bc049ba {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 745.627891] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd576488-dbb3-4bea-b561-fe95c679e2b4 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 745.634875] env[59659]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-55924f77-c384-40ad-b3b5-6f7743eb611e {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 745.723427] env[59659]: DEBUG nova.virt.vmwareapi.images [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] Downloading image file data 0fa786c9-f55e-46dc-b725-aa456ca9ff53 to the data store datastore1 {{(pid=59659) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 745.810901] env[59659]: DEBUG oslo_vmware.rw_handles [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4d9e4f65-b9e9-4426-ac23-2a2e2760d555/0fa786c9-f55e-46dc-b725-aa456ca9ff53/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59659) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 745.875701] env[59659]: DEBUG oslo_vmware.rw_handles [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Completed reading data from the image iterator. {{(pid=59659) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 745.875898] env[59659]: DEBUG oslo_vmware.rw_handles [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4d9e4f65-b9e9-4426-ac23-2a2e2760d555/0fa786c9-f55e-46dc-b725-aa456ca9ff53/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=59659) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 746.190895] env[59659]: DEBUG nova.network.neutron [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: f05a805d-7896-477c-b2ea-437faec88fba] Successfully created port: 96439e30-308b-4bba-82b1-71b05db60ec7 {{(pid=59659) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 746.696281] env[59659]: DEBUG oslo_concurrency.lockutils [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Acquiring lock "abec9f87-4cde-4b5e-ad2a-fa682842ac7a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 746.696462] env[59659]: DEBUG oslo_concurrency.lockutils [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Lock "abec9f87-4cde-4b5e-ad2a-fa682842ac7a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 746.709383] env[59659]: DEBUG nova.compute.manager [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] Starting instance... {{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 746.781323] env[59659]: DEBUG oslo_concurrency.lockutils [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 746.781323] env[59659]: DEBUG oslo_concurrency.lockutils [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 746.781323] env[59659]: INFO nova.compute.claims [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 746.930633] env[59659]: ERROR nova.compute.manager [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 45008a60-2dcb-40dd-b9e0-f4d0e6137628, please check neutron logs for more information. 
[ 746.930633] env[59659]: ERROR nova.compute.manager Traceback (most recent call last): [ 746.930633] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 746.930633] env[59659]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 746.930633] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 746.930633] env[59659]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 746.930633] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 746.930633] env[59659]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 746.930633] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 746.930633] env[59659]: ERROR nova.compute.manager self.force_reraise() [ 746.930633] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 746.930633] env[59659]: ERROR nova.compute.manager raise self.value [ 746.930633] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 746.930633] env[59659]: ERROR nova.compute.manager updated_port = self._update_port( [ 746.930633] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 746.930633] env[59659]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 746.931158] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 746.931158] env[59659]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 746.931158] env[59659]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 45008a60-2dcb-40dd-b9e0-f4d0e6137628, please check neutron logs for more information. 
[ 746.931158] env[59659]: ERROR nova.compute.manager [ 746.931158] env[59659]: Traceback (most recent call last): [ 746.931158] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 746.931158] env[59659]: listener.cb(fileno) [ 746.931158] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 746.931158] env[59659]: result = function(*args, **kwargs) [ 746.931158] env[59659]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 746.931158] env[59659]: return func(*args, **kwargs) [ 746.931158] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 746.931158] env[59659]: raise e [ 746.931158] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 746.931158] env[59659]: nwinfo = self.network_api.allocate_for_instance( [ 746.931158] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 746.931158] env[59659]: created_port_ids = self._update_ports_for_instance( [ 746.931158] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 746.931158] env[59659]: with excutils.save_and_reraise_exception(): [ 746.931158] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 746.931158] env[59659]: self.force_reraise() [ 746.931158] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 746.931158] env[59659]: raise self.value [ 746.931158] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 746.931158] env[59659]: updated_port = self._update_port( [ 746.931158] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 746.931158] env[59659]: _ensure_no_port_binding_failure(port) [ 746.931158] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 746.931158] env[59659]: raise exception.PortBindingFailed(port_id=port['id']) [ 746.932042] env[59659]: nova.exception.PortBindingFailed: Binding failed for port 45008a60-2dcb-40dd-b9e0-f4d0e6137628, please check neutron logs for more information. [ 746.932042] env[59659]: Removing descriptor: 14 [ 746.932042] env[59659]: ERROR nova.compute.manager [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 45008a60-2dcb-40dd-b9e0-f4d0e6137628, please check neutron logs for more information. 
[ 746.932042] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] Traceback (most recent call last): [ 746.932042] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 746.932042] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] yield resources [ 746.932042] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 746.932042] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] self.driver.spawn(context, instance, image_meta, [ 746.932042] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 746.932042] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 746.932042] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 746.932042] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] vm_ref = self.build_virtual_machine(instance, [ 746.932363] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 746.932363] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] vif_infos = vmwarevif.get_vif_info(self._session, [ 746.932363] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 746.932363] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] for vif in network_info: [ 746.932363] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 746.932363] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] return self._sync_wrapper(fn, *args, **kwargs) [ 746.932363] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 746.932363] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] self.wait() [ 746.932363] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 746.932363] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] self[:] = self._gt.wait() [ 746.932363] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 746.932363] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] return self._exit_event.wait() [ 746.932363] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 746.932363] env[59659]: ERROR nova.compute.manager 
[instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] result = hub.switch() [ 746.932727] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 746.932727] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] return self.greenlet.switch() [ 746.932727] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 746.932727] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] result = function(*args, **kwargs) [ 746.932727] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 746.932727] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] return func(*args, **kwargs) [ 746.932727] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 746.932727] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] raise e [ 746.932727] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 746.932727] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] nwinfo = self.network_api.allocate_for_instance( [ 746.932727] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 746.932727] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] created_port_ids = self._update_ports_for_instance( [ 746.932727] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 746.933852] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] with excutils.save_and_reraise_exception(): [ 746.933852] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 746.933852] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] self.force_reraise() [ 746.933852] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 746.933852] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] raise self.value [ 746.933852] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 746.933852] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] updated_port = self._update_port( [ 746.933852] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 746.933852] env[59659]: ERROR nova.compute.manager [instance: 
508c2a14-5f5b-4968-843a-1378d1c46e2f] _ensure_no_port_binding_failure(port) [ 746.933852] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 746.933852] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] raise exception.PortBindingFailed(port_id=port['id']) [ 746.933852] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] nova.exception.PortBindingFailed: Binding failed for port 45008a60-2dcb-40dd-b9e0-f4d0e6137628, please check neutron logs for more information. [ 746.933852] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] [ 746.934258] env[59659]: INFO nova.compute.manager [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] Terminating instance [ 746.934258] env[59659]: DEBUG oslo_concurrency.lockutils [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] Acquiring lock "refresh_cache-508c2a14-5f5b-4968-843a-1378d1c46e2f" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 746.934258] env[59659]: DEBUG oslo_concurrency.lockutils [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] Acquired lock "refresh_cache-508c2a14-5f5b-4968-843a-1378d1c46e2f" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 746.934359] env[59659]: DEBUG nova.network.neutron [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 746.974354] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e08916e-9501-4ff5-85f8-58c9e88fff06 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 746.982831] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8333c30f-b0da-43d0-adb2-b89d7b2f0883 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 747.017272] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b712b95-b0eb-4429-85c2-11ae07d3a771 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 747.026515] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-41a53d47-79c1-4bc7-b670-df35f5f9c88f {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 747.041844] env[59659]: DEBUG nova.compute.provider_tree [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory 
/opt/stack/nova/nova/compute/provider_tree.py:180}} [ 747.050736] env[59659]: DEBUG nova.scheduler.client.report [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 747.070667] env[59659]: DEBUG oslo_concurrency.lockutils [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.292s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 747.071176] env[59659]: DEBUG nova.compute.manager [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] Start building networks asynchronously for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 747.107243] env[59659]: DEBUG nova.compute.utils [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Using /dev/sd instead of None {{(pid=59659) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 747.107788] env[59659]: DEBUG nova.compute.manager [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] Allocating IP information in the background. {{(pid=59659) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 747.108124] env[59659]: DEBUG nova.network.neutron [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] allocate_for_instance() {{(pid=59659) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 747.120101] env[59659]: DEBUG nova.compute.manager [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] Start building block device mappings for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 747.195528] env[59659]: DEBUG nova.compute.manager [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] Start spawning the instance on the hypervisor. 
{{(pid=59659) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 747.217163] env[59659]: DEBUG nova.virt.hardware [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-30T19:41:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-30T19:41:37Z,direct_url=,disk_format='vmdk',id=0fa786c9-f55e-46dc-b725-aa456ca9ff53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='3f4ccdf020b5413c9a4233eccb6b55a8',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-30T19:41:37Z,virtual_size=,visibility=), allow threads: False {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 747.217163] env[59659]: DEBUG nova.virt.hardware [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Flavor limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 747.217163] env[59659]: DEBUG nova.virt.hardware [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Image limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 747.217314] env[59659]: DEBUG nova.virt.hardware [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Flavor pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 747.217468] env[59659]: DEBUG nova.virt.hardware [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Image pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 747.217547] env[59659]: DEBUG nova.virt.hardware [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 747.217743] env[59659]: DEBUG nova.virt.hardware [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 747.217890] env[59659]: DEBUG nova.virt.hardware [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 747.218057] env[59659]: DEBUG nova.virt.hardware [None 
req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Got 1 possible topologies {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 747.218216] env[59659]: DEBUG nova.virt.hardware [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 747.218382] env[59659]: DEBUG nova.virt.hardware [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 747.219251] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cccd3bab-2989-4d59-864a-62a019fc8f6d {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 747.228783] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-053c8244-9cce-4ba1-981a-d3f2e6502c7f {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 747.253024] env[59659]: DEBUG nova.network.neutron [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 747.496932] env[59659]: DEBUG nova.policy [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '593629524d524ad9b515d92b36e7b1e8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '91232108c8944a3da00233e9c54c9749', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59659) authorize /opt/stack/nova/nova/policy.py:203}} [ 747.720347] env[59659]: DEBUG oslo_concurrency.lockutils [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Acquiring lock "a75a3491-94b0-4754-8e42-7bf49194a022" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 747.720567] env[59659]: DEBUG oslo_concurrency.lockutils [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Lock "a75a3491-94b0-4754-8e42-7bf49194a022" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 747.730866] env[59659]: DEBUG 
nova.compute.manager [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: a75a3491-94b0-4754-8e42-7bf49194a022] Starting instance... {{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 747.783256] env[59659]: DEBUG oslo_concurrency.lockutils [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 747.783504] env[59659]: DEBUG oslo_concurrency.lockutils [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 747.785031] env[59659]: INFO nova.compute.claims [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: a75a3491-94b0-4754-8e42-7bf49194a022] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 747.908988] env[59659]: DEBUG nova.compute.manager [req-e7d05ded-96a2-4091-a523-d1448afe1b56 req-2627cf23-1b3d-4bf3-aa49-e194ce8d201b service nova] [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] Received event network-changed-45008a60-2dcb-40dd-b9e0-f4d0e6137628 {{(pid=59659) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 747.909190] env[59659]: DEBUG nova.compute.manager [req-e7d05ded-96a2-4091-a523-d1448afe1b56 req-2627cf23-1b3d-4bf3-aa49-e194ce8d201b service nova] [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] Refreshing instance network info cache due to event network-changed-45008a60-2dcb-40dd-b9e0-f4d0e6137628. 
{{(pid=59659) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 747.909369] env[59659]: DEBUG oslo_concurrency.lockutils [req-e7d05ded-96a2-4091-a523-d1448afe1b56 req-2627cf23-1b3d-4bf3-aa49-e194ce8d201b service nova] Acquiring lock "refresh_cache-508c2a14-5f5b-4968-843a-1378d1c46e2f" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 747.985481] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b29931ca-b836-46ac-a972-23eb3b15ca9a {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 747.994530] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a0d0c693-3289-4575-98d2-b00477283614 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 747.998495] env[59659]: DEBUG nova.network.neutron [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 748.029246] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-961a2311-d458-4cca-9259-eb2fb034f912 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 748.033637] env[59659]: DEBUG oslo_concurrency.lockutils [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] Releasing lock "refresh_cache-508c2a14-5f5b-4968-843a-1378d1c46e2f" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 748.034077] env[59659]: DEBUG nova.compute.manager [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] Start destroying the instance on the hypervisor. 
{{(pid=59659) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 748.034264] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] Destroying instance {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 748.034861] env[59659]: DEBUG oslo_concurrency.lockutils [req-e7d05ded-96a2-4091-a523-d1448afe1b56 req-2627cf23-1b3d-4bf3-aa49-e194ce8d201b service nova] Acquired lock "refresh_cache-508c2a14-5f5b-4968-843a-1378d1c46e2f" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 748.035049] env[59659]: DEBUG nova.network.neutron [req-e7d05ded-96a2-4091-a523-d1448afe1b56 req-2627cf23-1b3d-4bf3-aa49-e194ce8d201b service nova] [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] Refreshing network info cache for port 45008a60-2dcb-40dd-b9e0-f4d0e6137628 {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 748.036029] env[59659]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-b8bb92e7-997c-4ac2-a92e-16d790a2cbf4 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 748.047894] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-984f7a8f-1b1a-4f77-a40b-b85dc6001809 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 748.054766] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b05c881c-f947-4bdf-b807-2d50c5f39461 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 748.079917] env[59659]: DEBUG nova.compute.provider_tree [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 748.087302] env[59659]: WARNING nova.virt.vmwareapi.vmops [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 508c2a14-5f5b-4968-843a-1378d1c46e2f could not be found. [ 748.087523] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] Instance destroyed {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 748.087696] env[59659]: INFO nova.compute.manager [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] Took 0.05 seconds to destroy the instance on the hypervisor. 
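The PortBindingFailed traceback earlier in this section bottoms out in _ensure_no_port_binding_failure (nova/network/neutron.py:294) raising exception.PortBindingFailed(port_id=port['id']); the failed binding is what ultimately leads to the terminate/deallocate sequence logged here for instance 508c2a14-5f5b-4968-843a-1378d1c46e2f. The snippet below is a schematic reconstruction based only on the frames visible in the traceback: the attribute name checked ('binding:vif_type') and the sentinel value 'binding_failed' are assumptions about the helper's internals, and the exception class is simplified so the example is self-contained.

# Schematic reconstruction of the guard seen at the bottom of the traceback
# above. Only the raise itself is visible in the log; the checked attribute
# and sentinel value are assumptions.
class PortBindingFailed(Exception):
    def __init__(self, port_id):
        super().__init__(
            f"Binding failed for port {port_id}, "
            "please check neutron logs for more information.")

def _ensure_no_port_binding_failure(port):
    # Neutron marks a port its mechanism drivers could not bind; Nova turns
    # that marker into an exception so the build aborts and networking is
    # deallocated, as happens for the instance above.
    if port.get('binding:vif_type') == 'binding_failed':
        raise PortBindingFailed(port_id=port['id'])

# Example: a port that neutron failed to bind.
try:
    _ensure_no_port_binding_failure(
        {'id': '45008a60-2dcb-40dd-b9e0-f4d0e6137628',
         'binding:vif_type': 'binding_failed'})
except PortBindingFailed as exc:
    print(exc)  # mirrors the ERROR message logged above
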
[ 748.087932] env[59659]: DEBUG oslo.service.loopingcall [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59659) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 748.088612] env[59659]: DEBUG nova.compute.manager [-] [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 748.088711] env[59659]: DEBUG nova.network.neutron [-] [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 748.091495] env[59659]: DEBUG nova.scheduler.client.report [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 748.104846] env[59659]: DEBUG oslo_concurrency.lockutils [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.321s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 748.105476] env[59659]: DEBUG nova.compute.manager [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: a75a3491-94b0-4754-8e42-7bf49194a022] Start building networks asynchronously for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 748.140137] env[59659]: DEBUG nova.compute.utils [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Using /dev/sd instead of None {{(pid=59659) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 748.141380] env[59659]: DEBUG nova.compute.manager [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: a75a3491-94b0-4754-8e42-7bf49194a022] Not allocating networking since 'none' was specified. {{(pid=59659) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1948}} [ 748.151981] env[59659]: DEBUG nova.compute.manager [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: a75a3491-94b0-4754-8e42-7bf49194a022] Start building block device mappings for instance. 
{{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 748.219512] env[59659]: DEBUG nova.compute.manager [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: a75a3491-94b0-4754-8e42-7bf49194a022] Start spawning the instance on the hypervisor. {{(pid=59659) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 748.245031] env[59659]: DEBUG nova.virt.hardware [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-30T19:41:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-30T19:41:37Z,direct_url=,disk_format='vmdk',id=0fa786c9-f55e-46dc-b725-aa456ca9ff53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='3f4ccdf020b5413c9a4233eccb6b55a8',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-30T19:41:37Z,virtual_size=,visibility=), allow threads: False {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 748.245031] env[59659]: DEBUG nova.virt.hardware [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Flavor limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 748.245031] env[59659]: DEBUG nova.virt.hardware [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Image limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 748.245247] env[59659]: DEBUG nova.virt.hardware [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Flavor pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 748.245247] env[59659]: DEBUG nova.virt.hardware [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Image pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 748.245247] env[59659]: DEBUG nova.virt.hardware [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 748.245247] env[59659]: DEBUG nova.virt.hardware [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 748.245608] env[59659]: DEBUG nova.virt.hardware [None 
req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 748.245895] env[59659]: DEBUG nova.virt.hardware [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Got 1 possible topologies {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 748.246214] env[59659]: DEBUG nova.virt.hardware [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 748.246506] env[59659]: DEBUG nova.virt.hardware [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 748.247461] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b847fb97-44f6-4407-8676-0eec6bb55308 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 748.256735] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b04f534d-f6eb-4c8a-ab99-35f692f1d2fe {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 748.274239] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: a75a3491-94b0-4754-8e42-7bf49194a022] Instance VIF info [] {{(pid=59659) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 748.278928] env[59659]: DEBUG oslo.service.loopingcall [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59659) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 748.279517] env[59659]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a75a3491-94b0-4754-8e42-7bf49194a022] Creating VM on the ESX host {{(pid=59659) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 748.279862] env[59659]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-78e854cc-29d9-4f48-8b66-dadc72f03d47 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 748.302154] env[59659]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 748.302154] env[59659]: value = "task-1384550" [ 748.302154] env[59659]: _type = "Task" [ 748.302154] env[59659]: } to complete. {{(pid=59659) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 748.315155] env[59659]: DEBUG oslo_vmware.api [-] Task: {'id': task-1384550, 'name': CreateVM_Task} progress is 0%. 
{{(pid=59659) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 748.405477] env[59659]: DEBUG nova.network.neutron [req-e7d05ded-96a2-4091-a523-d1448afe1b56 req-2627cf23-1b3d-4bf3-aa49-e194ce8d201b service nova] [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 748.493660] env[59659]: DEBUG nova.network.neutron [-] [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 748.505610] env[59659]: DEBUG nova.network.neutron [-] [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 748.521942] env[59659]: INFO nova.compute.manager [-] [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] Took 0.43 seconds to deallocate network for instance. [ 748.524175] env[59659]: DEBUG nova.compute.claims [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] Aborting claim: {{(pid=59659) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 748.524528] env[59659]: DEBUG oslo_concurrency.lockutils [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 748.524829] env[59659]: DEBUG oslo_concurrency.lockutils [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 748.806797] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-64ad851b-6f4d-44ba-bd04-a876a9f7ec86 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 748.823209] env[59659]: DEBUG oslo_vmware.api [-] Task: {'id': task-1384550, 'name': CreateVM_Task, 'duration_secs': 0.268952} completed successfully. 
{{(pid=59659) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 748.824914] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-80ecb0a1-0178-4944-8060-a2333457a6d7 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 748.831813] env[59659]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a75a3491-94b0-4754-8e42-7bf49194a022] Created VM on the ESX host {{(pid=59659) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 748.832291] env[59659]: DEBUG oslo_concurrency.lockutils [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/0fa786c9-f55e-46dc-b725-aa456ca9ff53" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 748.832442] env[59659]: DEBUG oslo_concurrency.lockutils [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Acquired lock "[datastore1] devstack-image-cache_base/0fa786c9-f55e-46dc-b725-aa456ca9ff53" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 748.832983] env[59659]: DEBUG oslo_concurrency.lockutils [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/0fa786c9-f55e-46dc-b725-aa456ca9ff53" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 748.833467] env[59659]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c9d3bc59-e791-448b-8080-517ee81782f1 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 748.863202] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-513bf875-0645-4df0-a8a3-701aa61dbc72 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 748.865896] env[59659]: DEBUG oslo_vmware.api [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Waiting for the task: (returnval){ [ 748.865896] env[59659]: value = "session[522b4e95-5a6c-8a31-9bee-9871c53e7a49]52a52d19-0fb5-812e-3167-33418740dda2" [ 748.865896] env[59659]: _type = "Task" [ 748.865896] env[59659]: } to complete. 
{{(pid=59659) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 748.873102] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bbb13b6f-dc3f-4219-b6c7-18e2f8433d6f {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 748.883223] env[59659]: DEBUG oslo_concurrency.lockutils [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Releasing lock "[datastore1] devstack-image-cache_base/0fa786c9-f55e-46dc-b725-aa456ca9ff53" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 748.883470] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: a75a3491-94b0-4754-8e42-7bf49194a022] Processing image 0fa786c9-f55e-46dc-b725-aa456ca9ff53 {{(pid=59659) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 748.883714] env[59659]: DEBUG oslo_concurrency.lockutils [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/0fa786c9-f55e-46dc-b725-aa456ca9ff53/0fa786c9-f55e-46dc-b725-aa456ca9ff53.vmdk" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 748.892603] env[59659]: DEBUG nova.compute.provider_tree [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 748.905132] env[59659]: DEBUG nova.scheduler.client.report [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 748.922609] env[59659]: DEBUG oslo_concurrency.lockutils [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.398s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 748.923367] env[59659]: ERROR nova.compute.manager [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 45008a60-2dcb-40dd-b9e0-f4d0e6137628, please check neutron logs for more information. 
[ 748.923367] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] Traceback (most recent call last): [ 748.923367] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 748.923367] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] self.driver.spawn(context, instance, image_meta, [ 748.923367] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 748.923367] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 748.923367] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 748.923367] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] vm_ref = self.build_virtual_machine(instance, [ 748.923367] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 748.923367] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] vif_infos = vmwarevif.get_vif_info(self._session, [ 748.923367] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 748.923992] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] for vif in network_info: [ 748.923992] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 748.923992] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] return self._sync_wrapper(fn, *args, **kwargs) [ 748.923992] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 748.923992] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] self.wait() [ 748.923992] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 748.923992] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] self[:] = self._gt.wait() [ 748.923992] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 748.923992] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] return self._exit_event.wait() [ 748.923992] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 748.923992] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] result = hub.switch() [ 748.923992] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 748.923992] env[59659]: ERROR 
nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] return self.greenlet.switch() [ 748.923992] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 748.925617] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] result = function(*args, **kwargs) [ 748.925617] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 748.925617] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] return func(*args, **kwargs) [ 748.925617] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 748.925617] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] raise e [ 748.925617] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 748.925617] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] nwinfo = self.network_api.allocate_for_instance( [ 748.925617] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 748.925617] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] created_port_ids = self._update_ports_for_instance( [ 748.925617] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 748.925617] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] with excutils.save_and_reraise_exception(): [ 748.925617] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 748.925617] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] self.force_reraise() [ 748.926302] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 748.926302] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] raise self.value [ 748.926302] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 748.926302] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] updated_port = self._update_port( [ 748.926302] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 748.926302] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] _ensure_no_port_binding_failure(port) [ 748.926302] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 748.926302] env[59659]: ERROR 
nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] raise exception.PortBindingFailed(port_id=port['id']) [ 748.926302] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] nova.exception.PortBindingFailed: Binding failed for port 45008a60-2dcb-40dd-b9e0-f4d0e6137628, please check neutron logs for more information. [ 748.926302] env[59659]: ERROR nova.compute.manager [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] [ 748.926302] env[59659]: DEBUG nova.compute.utils [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] Binding failed for port 45008a60-2dcb-40dd-b9e0-f4d0e6137628, please check neutron logs for more information. {{(pid=59659) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 748.926799] env[59659]: DEBUG nova.compute.manager [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] Build of instance 508c2a14-5f5b-4968-843a-1378d1c46e2f was re-scheduled: Binding failed for port 45008a60-2dcb-40dd-b9e0-f4d0e6137628, please check neutron logs for more information. {{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 748.926799] env[59659]: DEBUG nova.compute.manager [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] Unplugging VIFs for instance {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 748.926799] env[59659]: DEBUG oslo_concurrency.lockutils [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] Acquiring lock "refresh_cache-508c2a14-5f5b-4968-843a-1378d1c46e2f" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 749.174056] env[59659]: DEBUG nova.network.neutron [req-e7d05ded-96a2-4091-a523-d1448afe1b56 req-2627cf23-1b3d-4bf3-aa49-e194ce8d201b service nova] [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 749.184034] env[59659]: DEBUG oslo_concurrency.lockutils [req-e7d05ded-96a2-4091-a523-d1448afe1b56 req-2627cf23-1b3d-4bf3-aa49-e194ce8d201b service nova] Releasing lock "refresh_cache-508c2a14-5f5b-4968-843a-1378d1c46e2f" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 749.184594] env[59659]: DEBUG oslo_concurrency.lockutils [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] Acquired lock "refresh_cache-508c2a14-5f5b-4968-843a-1378d1c46e2f" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 749.184594] env[59659]: DEBUG nova.network.neutron [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 
749.279143] env[59659]: DEBUG nova.network.neutron [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 750.165841] env[59659]: DEBUG nova.network.neutron [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 750.178684] env[59659]: DEBUG oslo_concurrency.lockutils [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] Releasing lock "refresh_cache-508c2a14-5f5b-4968-843a-1378d1c46e2f" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 750.178684] env[59659]: DEBUG nova.compute.manager [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 750.178684] env[59659]: DEBUG nova.compute.manager [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 750.178684] env[59659]: DEBUG nova.network.neutron [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 750.277458] env[59659]: DEBUG nova.network.neutron [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 750.286767] env[59659]: DEBUG nova.network.neutron [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 750.299018] env[59659]: INFO nova.compute.manager [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] [instance: 508c2a14-5f5b-4968-843a-1378d1c46e2f] Took 0.12 seconds to deallocate network for instance. 
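The traceback above ends in _ensure_no_port_binding_failure raising PortBindingFailed, after which the claim is aborted under the "compute_resources" lock, the network is deallocated, and the build is re-scheduled. A minimal sketch of that check, assuming only the standard Neutron port dict fields (illustrative, not the Nova source):

class PortBindingFailed(Exception):
    # Message text mirrors the exception recorded in the log.
    def __init__(self, port_id):
        super().__init__("Binding failed for port %s, please check neutron "
                         "logs for more information." % port_id)
        self.port_id = port_id

def ensure_no_port_binding_failure(port):
    # Neutron marks a port it could not bind with
    # binding:vif_type == 'binding_failed'; turning that into an exception
    # is what lets the compute manager abort and re-schedule the build.
    if port.get('binding:vif_type') == 'binding_failed':
        raise PortBindingFailed(port_id=port['id'])

# Usage with a port dict shaped like the one in the traceback (id copied from
# the log, all other fields omitted):
try:
    ensure_no_port_binding_failure(
        {'id': '45008a60-2dcb-40dd-b9e0-f4d0e6137628',
         'binding:vif_type': 'binding_failed'})
except PortBindingFailed as exc:
    print(exc)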
[ 750.342696] env[59659]: DEBUG oslo_concurrency.lockutils [None req-5efc2d9d-ed13-4a66-91cf-0c348fc8d21a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Acquiring lock "abec9f87-4cde-4b5e-ad2a-fa682842ac7a" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 750.404639] env[59659]: DEBUG nova.compute.manager [req-682e16ab-0dbc-47b5-a917-a174c764cf4e req-f891ff7c-ea6a-4b98-8729-54016615eeb2 service nova] [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] Received event network-changed-b6e44e9f-1e71-4450-80a0-4d9203c470e9 {{(pid=59659) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 750.404930] env[59659]: DEBUG nova.compute.manager [req-682e16ab-0dbc-47b5-a917-a174c764cf4e req-f891ff7c-ea6a-4b98-8729-54016615eeb2 service nova] [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] Refreshing instance network info cache due to event network-changed-b6e44e9f-1e71-4450-80a0-4d9203c470e9. {{(pid=59659) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 750.406360] env[59659]: DEBUG oslo_concurrency.lockutils [req-682e16ab-0dbc-47b5-a917-a174c764cf4e req-f891ff7c-ea6a-4b98-8729-54016615eeb2 service nova] Acquiring lock "refresh_cache-b8404801-b787-4db2-aa13-320f87ca5ac5" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 750.406360] env[59659]: DEBUG oslo_concurrency.lockutils [req-682e16ab-0dbc-47b5-a917-a174c764cf4e req-f891ff7c-ea6a-4b98-8729-54016615eeb2 service nova] Acquired lock "refresh_cache-b8404801-b787-4db2-aa13-320f87ca5ac5" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 750.406360] env[59659]: DEBUG nova.network.neutron [req-682e16ab-0dbc-47b5-a917-a174c764cf4e req-f891ff7c-ea6a-4b98-8729-54016615eeb2 service nova] [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] Refreshing network info cache for port b6e44e9f-1e71-4450-80a0-4d9203c470e9 {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 750.421075] env[59659]: INFO nova.scheduler.client.report [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] Deleted allocations for instance 508c2a14-5f5b-4968-843a-1378d1c46e2f [ 750.439800] env[59659]: DEBUG oslo_concurrency.lockutils [None req-54d6db5f-fa9c-4c6d-83d6-e41dfdf83fa0 tempest-AttachVolumeTestJSON-1478151191 tempest-AttachVolumeTestJSON-1478151191-project-member] Lock "508c2a14-5f5b-4968-843a-1378d1c46e2f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 13.777s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 750.451332] env[59659]: DEBUG nova.network.neutron [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] Successfully created port: f3f7985f-2ffb-489b-bf50-5a30759b413b {{(pid=59659) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 750.500122] env[59659]: ERROR nova.compute.manager [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Instance failed network setup after 1 attempt(s): 
nova.exception.PortBindingFailed: Binding failed for port b6e44e9f-1e71-4450-80a0-4d9203c470e9, please check neutron logs for more information. [ 750.500122] env[59659]: ERROR nova.compute.manager Traceback (most recent call last): [ 750.500122] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 750.500122] env[59659]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 750.500122] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 750.500122] env[59659]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 750.500122] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 750.500122] env[59659]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 750.500122] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 750.500122] env[59659]: ERROR nova.compute.manager self.force_reraise() [ 750.500122] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 750.500122] env[59659]: ERROR nova.compute.manager raise self.value [ 750.500122] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 750.500122] env[59659]: ERROR nova.compute.manager updated_port = self._update_port( [ 750.500122] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 750.500122] env[59659]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 750.500641] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 750.500641] env[59659]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 750.500641] env[59659]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port b6e44e9f-1e71-4450-80a0-4d9203c470e9, please check neutron logs for more information. 
[ 750.500641] env[59659]: ERROR nova.compute.manager [ 750.500641] env[59659]: Traceback (most recent call last): [ 750.500641] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 750.500641] env[59659]: listener.cb(fileno) [ 750.500641] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 750.500641] env[59659]: result = function(*args, **kwargs) [ 750.500641] env[59659]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 750.500641] env[59659]: return func(*args, **kwargs) [ 750.500641] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 750.500641] env[59659]: raise e [ 750.500641] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 750.500641] env[59659]: nwinfo = self.network_api.allocate_for_instance( [ 750.500641] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 750.500641] env[59659]: created_port_ids = self._update_ports_for_instance( [ 750.500641] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 750.500641] env[59659]: with excutils.save_and_reraise_exception(): [ 750.500641] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 750.500641] env[59659]: self.force_reraise() [ 750.500641] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 750.500641] env[59659]: raise self.value [ 750.500641] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 750.500641] env[59659]: updated_port = self._update_port( [ 750.500641] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 750.500641] env[59659]: _ensure_no_port_binding_failure(port) [ 750.500641] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 750.500641] env[59659]: raise exception.PortBindingFailed(port_id=port['id']) [ 750.501835] env[59659]: nova.exception.PortBindingFailed: Binding failed for port b6e44e9f-1e71-4450-80a0-4d9203c470e9, please check neutron logs for more information. [ 750.501835] env[59659]: Removing descriptor: 22 [ 750.501835] env[59659]: ERROR nova.compute.manager [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port b6e44e9f-1e71-4450-80a0-4d9203c470e9, please check neutron logs for more information. 
[ 750.501835] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] Traceback (most recent call last): [ 750.501835] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 750.501835] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] yield resources [ 750.501835] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 750.501835] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] self.driver.spawn(context, instance, image_meta, [ 750.501835] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 750.501835] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 750.501835] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 750.501835] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] vm_ref = self.build_virtual_machine(instance, [ 750.502199] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 750.502199] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] vif_infos = vmwarevif.get_vif_info(self._session, [ 750.502199] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 750.502199] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] for vif in network_info: [ 750.502199] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 750.502199] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] return self._sync_wrapper(fn, *args, **kwargs) [ 750.502199] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 750.502199] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] self.wait() [ 750.502199] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 750.502199] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] self[:] = self._gt.wait() [ 750.502199] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 750.502199] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] return self._exit_event.wait() [ 750.502199] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 750.502199] env[59659]: ERROR nova.compute.manager 
[instance: b8404801-b787-4db2-aa13-320f87ca5ac5] result = hub.switch() [ 750.503659] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 750.503659] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] return self.greenlet.switch() [ 750.503659] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 750.503659] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] result = function(*args, **kwargs) [ 750.503659] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 750.503659] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] return func(*args, **kwargs) [ 750.503659] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 750.503659] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] raise e [ 750.503659] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 750.503659] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] nwinfo = self.network_api.allocate_for_instance( [ 750.503659] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 750.503659] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] created_port_ids = self._update_ports_for_instance( [ 750.503659] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 750.505068] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] with excutils.save_and_reraise_exception(): [ 750.505068] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 750.505068] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] self.force_reraise() [ 750.505068] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 750.505068] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] raise self.value [ 750.505068] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 750.505068] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] updated_port = self._update_port( [ 750.505068] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 750.505068] env[59659]: ERROR nova.compute.manager [instance: 
b8404801-b787-4db2-aa13-320f87ca5ac5] _ensure_no_port_binding_failure(port) [ 750.505068] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 750.505068] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] raise exception.PortBindingFailed(port_id=port['id']) [ 750.505068] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] nova.exception.PortBindingFailed: Binding failed for port b6e44e9f-1e71-4450-80a0-4d9203c470e9, please check neutron logs for more information. [ 750.505068] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] [ 750.505444] env[59659]: INFO nova.compute.manager [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] Terminating instance [ 750.505949] env[59659]: DEBUG nova.network.neutron [req-682e16ab-0dbc-47b5-a917-a174c764cf4e req-f891ff7c-ea6a-4b98-8729-54016615eeb2 service nova] [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 750.508542] env[59659]: DEBUG oslo_concurrency.lockutils [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Acquiring lock "refresh_cache-b8404801-b787-4db2-aa13-320f87ca5ac5" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 750.969624] env[59659]: ERROR nova.compute.manager [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port b97ea0a9-00ea-409a-ba57-0c68616a1f0a, please check neutron logs for more information. 
[ 750.969624] env[59659]: ERROR nova.compute.manager Traceback (most recent call last): [ 750.969624] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 750.969624] env[59659]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 750.969624] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 750.969624] env[59659]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 750.969624] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 750.969624] env[59659]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 750.969624] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 750.969624] env[59659]: ERROR nova.compute.manager self.force_reraise() [ 750.969624] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 750.969624] env[59659]: ERROR nova.compute.manager raise self.value [ 750.969624] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 750.969624] env[59659]: ERROR nova.compute.manager updated_port = self._update_port( [ 750.969624] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 750.969624] env[59659]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 750.970101] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 750.970101] env[59659]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 750.970101] env[59659]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port b97ea0a9-00ea-409a-ba57-0c68616a1f0a, please check neutron logs for more information. 
[ 750.970101] env[59659]: ERROR nova.compute.manager [ 750.970101] env[59659]: Traceback (most recent call last): [ 750.970101] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 750.970101] env[59659]: listener.cb(fileno) [ 750.970101] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 750.970101] env[59659]: result = function(*args, **kwargs) [ 750.970101] env[59659]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 750.970101] env[59659]: return func(*args, **kwargs) [ 750.970101] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 750.970101] env[59659]: raise e [ 750.970101] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 750.970101] env[59659]: nwinfo = self.network_api.allocate_for_instance( [ 750.970101] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 750.970101] env[59659]: created_port_ids = self._update_ports_for_instance( [ 750.970101] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 750.970101] env[59659]: with excutils.save_and_reraise_exception(): [ 750.970101] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 750.970101] env[59659]: self.force_reraise() [ 750.970101] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 750.970101] env[59659]: raise self.value [ 750.970101] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 750.970101] env[59659]: updated_port = self._update_port( [ 750.970101] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 750.970101] env[59659]: _ensure_no_port_binding_failure(port) [ 750.970101] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 750.970101] env[59659]: raise exception.PortBindingFailed(port_id=port['id']) [ 750.970868] env[59659]: nova.exception.PortBindingFailed: Binding failed for port b97ea0a9-00ea-409a-ba57-0c68616a1f0a, please check neutron logs for more information. [ 750.970868] env[59659]: Removing descriptor: 12 [ 750.970868] env[59659]: ERROR nova.compute.manager [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] [instance: e2620c94-5629-4094-a92f-d83d9efd6205] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port b97ea0a9-00ea-409a-ba57-0c68616a1f0a, please check neutron logs for more information. 
[ 750.970868] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] Traceback (most recent call last): [ 750.970868] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 750.970868] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] yield resources [ 750.970868] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 750.970868] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] self.driver.spawn(context, instance, image_meta, [ 750.970868] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 750.970868] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] self._vmops.spawn(context, instance, image_meta, injected_files, [ 750.970868] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 750.970868] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] vm_ref = self.build_virtual_machine(instance, [ 750.971240] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 750.971240] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] vif_infos = vmwarevif.get_vif_info(self._session, [ 750.971240] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 750.971240] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] for vif in network_info: [ 750.971240] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 750.971240] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] return self._sync_wrapper(fn, *args, **kwargs) [ 750.971240] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 750.971240] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] self.wait() [ 750.971240] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 750.971240] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] self[:] = self._gt.wait() [ 750.971240] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 750.971240] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] return self._exit_event.wait() [ 750.971240] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 750.971240] env[59659]: ERROR nova.compute.manager 
[instance: e2620c94-5629-4094-a92f-d83d9efd6205] result = hub.switch() [ 750.971637] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 750.971637] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] return self.greenlet.switch() [ 750.971637] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 750.971637] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] result = function(*args, **kwargs) [ 750.971637] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 750.971637] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] return func(*args, **kwargs) [ 750.971637] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 750.971637] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] raise e [ 750.971637] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 750.971637] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] nwinfo = self.network_api.allocate_for_instance( [ 750.971637] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 750.971637] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] created_port_ids = self._update_ports_for_instance( [ 750.971637] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 750.972039] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] with excutils.save_and_reraise_exception(): [ 750.972039] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 750.972039] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] self.force_reraise() [ 750.972039] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 750.972039] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] raise self.value [ 750.972039] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 750.972039] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] updated_port = self._update_port( [ 750.972039] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 750.972039] env[59659]: ERROR nova.compute.manager [instance: 
e2620c94-5629-4094-a92f-d83d9efd6205] _ensure_no_port_binding_failure(port) [ 750.972039] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 750.972039] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] raise exception.PortBindingFailed(port_id=port['id']) [ 750.972039] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] nova.exception.PortBindingFailed: Binding failed for port b97ea0a9-00ea-409a-ba57-0c68616a1f0a, please check neutron logs for more information. [ 750.972039] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] [ 750.972361] env[59659]: INFO nova.compute.manager [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] [instance: e2620c94-5629-4094-a92f-d83d9efd6205] Terminating instance [ 750.977646] env[59659]: DEBUG oslo_concurrency.lockutils [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] Acquiring lock "refresh_cache-e2620c94-5629-4094-a92f-d83d9efd6205" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 750.977803] env[59659]: DEBUG oslo_concurrency.lockutils [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] Acquired lock "refresh_cache-e2620c94-5629-4094-a92f-d83d9efd6205" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 750.977966] env[59659]: DEBUG nova.network.neutron [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] [instance: e2620c94-5629-4094-a92f-d83d9efd6205] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 751.075168] env[59659]: DEBUG nova.network.neutron [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] [instance: e2620c94-5629-4094-a92f-d83d9efd6205] Instance cache missing network info. 
{{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 751.418353] env[59659]: DEBUG nova.network.neutron [req-682e16ab-0dbc-47b5-a917-a174c764cf4e req-f891ff7c-ea6a-4b98-8729-54016615eeb2 service nova] [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 751.426662] env[59659]: DEBUG oslo_concurrency.lockutils [req-682e16ab-0dbc-47b5-a917-a174c764cf4e req-f891ff7c-ea6a-4b98-8729-54016615eeb2 service nova] Releasing lock "refresh_cache-b8404801-b787-4db2-aa13-320f87ca5ac5" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 751.427149] env[59659]: DEBUG oslo_concurrency.lockutils [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Acquired lock "refresh_cache-b8404801-b787-4db2-aa13-320f87ca5ac5" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 751.427397] env[59659]: DEBUG nova.network.neutron [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 751.540782] env[59659]: DEBUG nova.network.neutron [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 751.817243] env[59659]: DEBUG nova.network.neutron [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] [instance: e2620c94-5629-4094-a92f-d83d9efd6205] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 751.826774] env[59659]: DEBUG oslo_concurrency.lockutils [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] Releasing lock "refresh_cache-e2620c94-5629-4094-a92f-d83d9efd6205" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 751.827358] env[59659]: DEBUG nova.compute.manager [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] [instance: e2620c94-5629-4094-a92f-d83d9efd6205] Start destroying the instance on the hypervisor. 
{{(pid=59659) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 751.827462] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] [instance: e2620c94-5629-4094-a92f-d83d9efd6205] Destroying instance {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 751.828389] env[59659]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-cfc6e1d4-fe0b-4fb2-a995-6d758c3f6f2d {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 751.841986] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1cd8725b-a266-4005-a4a5-76a9271c2064 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 751.871604] env[59659]: WARNING nova.virt.vmwareapi.vmops [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] [instance: e2620c94-5629-4094-a92f-d83d9efd6205] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance e2620c94-5629-4094-a92f-d83d9efd6205 could not be found. [ 751.871964] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] [instance: e2620c94-5629-4094-a92f-d83d9efd6205] Instance destroyed {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 751.872163] env[59659]: INFO nova.compute.manager [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] [instance: e2620c94-5629-4094-a92f-d83d9efd6205] Took 0.04 seconds to destroy the instance on the hypervisor. [ 751.872412] env[59659]: DEBUG oslo.service.loopingcall [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59659) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 751.872755] env[59659]: DEBUG nova.compute.manager [-] [instance: e2620c94-5629-4094-a92f-d83d9efd6205] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 751.872755] env[59659]: DEBUG nova.network.neutron [-] [instance: e2620c94-5629-4094-a92f-d83d9efd6205] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 751.942209] env[59659]: DEBUG nova.network.neutron [-] [instance: e2620c94-5629-4094-a92f-d83d9efd6205] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 751.954409] env[59659]: DEBUG nova.network.neutron [-] [instance: e2620c94-5629-4094-a92f-d83d9efd6205] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 751.965085] env[59659]: INFO nova.compute.manager [-] [instance: e2620c94-5629-4094-a92f-d83d9efd6205] Took 0.09 seconds to deallocate network for instance. 
[ 751.967168] env[59659]: DEBUG nova.compute.claims [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] [instance: e2620c94-5629-4094-a92f-d83d9efd6205] Aborting claim: {{(pid=59659) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 751.967528] env[59659]: DEBUG oslo_concurrency.lockutils [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 751.967623] env[59659]: DEBUG oslo_concurrency.lockutils [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 752.186774] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f5e8257-0cfa-4d80-a6c4-ce6715839d80 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 752.195846] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-15a71d0c-158d-45c2-acc4-9a481ac4b864 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 752.237485] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6049ea31-7de9-44d7-9929-1f427624c40e {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 752.246610] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d150b4d-01ac-4960-be1a-ff1843cd30ce {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 752.263184] env[59659]: DEBUG nova.compute.provider_tree [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 752.271398] env[59659]: DEBUG nova.scheduler.client.report [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 752.284774] env[59659]: DEBUG oslo_concurrency.lockutils [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 
tempest-AttachVolumeShelveTestJSON-687008755-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.317s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 752.285397] env[59659]: ERROR nova.compute.manager [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] [instance: e2620c94-5629-4094-a92f-d83d9efd6205] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port b97ea0a9-00ea-409a-ba57-0c68616a1f0a, please check neutron logs for more information. [ 752.285397] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] Traceback (most recent call last): [ 752.285397] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 752.285397] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] self.driver.spawn(context, instance, image_meta, [ 752.285397] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 752.285397] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] self._vmops.spawn(context, instance, image_meta, injected_files, [ 752.285397] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 752.285397] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] vm_ref = self.build_virtual_machine(instance, [ 752.285397] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 752.285397] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] vif_infos = vmwarevif.get_vif_info(self._session, [ 752.285397] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 752.285758] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] for vif in network_info: [ 752.285758] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 752.285758] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] return self._sync_wrapper(fn, *args, **kwargs) [ 752.285758] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 752.285758] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] self.wait() [ 752.285758] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 752.285758] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] self[:] = self._gt.wait() [ 752.285758] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] File 
"/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 752.285758] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] return self._exit_event.wait() [ 752.285758] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 752.285758] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] result = hub.switch() [ 752.285758] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 752.285758] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] return self.greenlet.switch() [ 752.285758] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 752.286125] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] result = function(*args, **kwargs) [ 752.286125] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 752.286125] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] return func(*args, **kwargs) [ 752.286125] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 752.286125] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] raise e [ 752.286125] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 752.286125] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] nwinfo = self.network_api.allocate_for_instance( [ 752.286125] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 752.286125] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] created_port_ids = self._update_ports_for_instance( [ 752.286125] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 752.286125] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] with excutils.save_and_reraise_exception(): [ 752.286125] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 752.286125] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] self.force_reraise() [ 752.286457] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 752.286457] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] raise self.value [ 752.286457] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] File 
"/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 752.286457] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] updated_port = self._update_port( [ 752.286457] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 752.286457] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] _ensure_no_port_binding_failure(port) [ 752.286457] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 752.286457] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] raise exception.PortBindingFailed(port_id=port['id']) [ 752.286457] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] nova.exception.PortBindingFailed: Binding failed for port b97ea0a9-00ea-409a-ba57-0c68616a1f0a, please check neutron logs for more information. [ 752.286457] env[59659]: ERROR nova.compute.manager [instance: e2620c94-5629-4094-a92f-d83d9efd6205] [ 752.286457] env[59659]: DEBUG nova.compute.utils [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] [instance: e2620c94-5629-4094-a92f-d83d9efd6205] Binding failed for port b97ea0a9-00ea-409a-ba57-0c68616a1f0a, please check neutron logs for more information. {{(pid=59659) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 752.287475] env[59659]: DEBUG nova.compute.manager [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] [instance: e2620c94-5629-4094-a92f-d83d9efd6205] Build of instance e2620c94-5629-4094-a92f-d83d9efd6205 was re-scheduled: Binding failed for port b97ea0a9-00ea-409a-ba57-0c68616a1f0a, please check neutron logs for more information. 
{{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 752.287897] env[59659]: DEBUG nova.compute.manager [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] [instance: e2620c94-5629-4094-a92f-d83d9efd6205] Unplugging VIFs for instance {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 752.288139] env[59659]: DEBUG oslo_concurrency.lockutils [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] Acquiring lock "refresh_cache-e2620c94-5629-4094-a92f-d83d9efd6205" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 752.288284] env[59659]: DEBUG oslo_concurrency.lockutils [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] Acquired lock "refresh_cache-e2620c94-5629-4094-a92f-d83d9efd6205" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 752.288434] env[59659]: DEBUG nova.network.neutron [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] [instance: e2620c94-5629-4094-a92f-d83d9efd6205] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 752.362706] env[59659]: DEBUG nova.network.neutron [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] [instance: e2620c94-5629-4094-a92f-d83d9efd6205] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 752.459193] env[59659]: DEBUG nova.network.neutron [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 752.472865] env[59659]: DEBUG oslo_concurrency.lockutils [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Releasing lock "refresh_cache-b8404801-b787-4db2-aa13-320f87ca5ac5" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 752.473407] env[59659]: DEBUG nova.compute.manager [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] Start destroying the instance on the hypervisor. 
{{(pid=59659) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 752.473604] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] Destroying instance {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 752.474216] env[59659]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-5f2eae47-8b4e-46b2-bcc1-e354a187a986 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 752.486152] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8272a87c-077c-41b0-978d-5cbacba3fcae {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 752.517467] env[59659]: WARNING nova.virt.vmwareapi.vmops [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance b8404801-b787-4db2-aa13-320f87ca5ac5 could not be found. [ 752.517758] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] Instance destroyed {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 752.517936] env[59659]: INFO nova.compute.manager [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] Took 0.04 seconds to destroy the instance on the hypervisor. [ 752.518304] env[59659]: DEBUG oslo.service.loopingcall [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59659) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 752.518551] env[59659]: DEBUG nova.compute.manager [-] [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 752.518645] env[59659]: DEBUG nova.network.neutron [-] [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 752.597412] env[59659]: DEBUG nova.network.neutron [-] [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 752.607618] env[59659]: DEBUG nova.network.neutron [-] [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 752.618793] env[59659]: INFO nova.compute.manager [-] [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] Took 0.10 seconds to deallocate network for instance. 
[ 752.621823] env[59659]: DEBUG nova.compute.claims [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] Aborting claim: {{(pid=59659) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 752.622129] env[59659]: DEBUG oslo_concurrency.lockutils [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 752.622353] env[59659]: DEBUG oslo_concurrency.lockutils [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 752.821564] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd890b79-0522-4537-a773-d35118bb56c7 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 752.834790] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4bba3d04-eef8-4b85-8c10-d4dd2a3a778c {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 752.871373] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f61ea68c-46f0-402e-82be-48202d7375bc {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 752.880195] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f3a5604-5a79-4802-8338-d4b9356ae792 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 752.894713] env[59659]: DEBUG nova.compute.provider_tree [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 752.903673] env[59659]: DEBUG nova.scheduler.client.report [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 752.919488] env[59659]: DEBUG oslo_concurrency.lockutils [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.297s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 752.921369] env[59659]: ERROR nova.compute.manager [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port b6e44e9f-1e71-4450-80a0-4d9203c470e9, please check neutron logs for more information. [ 752.921369] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] Traceback (most recent call last): [ 752.921369] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 752.921369] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] self.driver.spawn(context, instance, image_meta, [ 752.921369] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 752.921369] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 752.921369] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 752.921369] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] vm_ref = self.build_virtual_machine(instance, [ 752.921369] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 752.921369] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] vif_infos = vmwarevif.get_vif_info(self._session, [ 752.921369] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 752.921711] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] for vif in network_info: [ 752.921711] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 752.921711] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] return self._sync_wrapper(fn, *args, **kwargs) [ 752.921711] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 752.921711] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] self.wait() [ 752.921711] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 752.921711] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] self[:] = self._gt.wait() [ 752.921711] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 752.921711] env[59659]: ERROR nova.compute.manager [instance: 
b8404801-b787-4db2-aa13-320f87ca5ac5] return self._exit_event.wait() [ 752.921711] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 752.921711] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] result = hub.switch() [ 752.921711] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 752.921711] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] return self.greenlet.switch() [ 752.921711] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 752.922078] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] result = function(*args, **kwargs) [ 752.922078] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 752.922078] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] return func(*args, **kwargs) [ 752.922078] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 752.922078] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] raise e [ 752.922078] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 752.922078] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] nwinfo = self.network_api.allocate_for_instance( [ 752.922078] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 752.922078] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] created_port_ids = self._update_ports_for_instance( [ 752.922078] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 752.922078] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] with excutils.save_and_reraise_exception(): [ 752.922078] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 752.922078] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] self.force_reraise() [ 752.922389] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 752.922389] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] raise self.value [ 752.922389] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 752.922389] env[59659]: ERROR nova.compute.manager [instance: 
b8404801-b787-4db2-aa13-320f87ca5ac5] updated_port = self._update_port( [ 752.922389] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 752.922389] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] _ensure_no_port_binding_failure(port) [ 752.922389] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 752.922389] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] raise exception.PortBindingFailed(port_id=port['id']) [ 752.922389] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] nova.exception.PortBindingFailed: Binding failed for port b6e44e9f-1e71-4450-80a0-4d9203c470e9, please check neutron logs for more information. [ 752.922389] env[59659]: ERROR nova.compute.manager [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] [ 752.922389] env[59659]: DEBUG nova.compute.utils [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] Binding failed for port b6e44e9f-1e71-4450-80a0-4d9203c470e9, please check neutron logs for more information. {{(pid=59659) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 752.924109] env[59659]: DEBUG nova.compute.manager [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] Build of instance b8404801-b787-4db2-aa13-320f87ca5ac5 was re-scheduled: Binding failed for port b6e44e9f-1e71-4450-80a0-4d9203c470e9, please check neutron logs for more information. 
{{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 752.924560] env[59659]: DEBUG nova.compute.manager [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] Unplugging VIFs for instance {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 752.924780] env[59659]: DEBUG oslo_concurrency.lockutils [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Acquiring lock "refresh_cache-b8404801-b787-4db2-aa13-320f87ca5ac5" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 752.924922] env[59659]: DEBUG oslo_concurrency.lockutils [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Acquired lock "refresh_cache-b8404801-b787-4db2-aa13-320f87ca5ac5" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 752.925092] env[59659]: DEBUG nova.network.neutron [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 753.009182] env[59659]: DEBUG nova.network.neutron [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 753.071964] env[59659]: DEBUG nova.network.neutron [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] [instance: e2620c94-5629-4094-a92f-d83d9efd6205] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 753.086140] env[59659]: DEBUG oslo_concurrency.lockutils [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] Releasing lock "refresh_cache-e2620c94-5629-4094-a92f-d83d9efd6205" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 753.086330] env[59659]: DEBUG nova.compute.manager [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 753.086515] env[59659]: DEBUG nova.compute.manager [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] [instance: e2620c94-5629-4094-a92f-d83d9efd6205] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 753.086674] env[59659]: DEBUG nova.network.neutron [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] [instance: e2620c94-5629-4094-a92f-d83d9efd6205] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 753.161173] env[59659]: DEBUG nova.network.neutron [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] [instance: e2620c94-5629-4094-a92f-d83d9efd6205] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 753.170667] env[59659]: DEBUG nova.network.neutron [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] [instance: e2620c94-5629-4094-a92f-d83d9efd6205] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 753.189280] env[59659]: INFO nova.compute.manager [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] [instance: e2620c94-5629-4094-a92f-d83d9efd6205] Took 0.10 seconds to deallocate network for instance. 
[ 753.297364] env[59659]: INFO nova.scheduler.client.report [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] Deleted allocations for instance e2620c94-5629-4094-a92f-d83d9efd6205 [ 753.315334] env[59659]: DEBUG oslo_concurrency.lockutils [None req-feb153f7-953e-4e96-b1a6-23f5996cad56 tempest-AttachVolumeShelveTestJSON-687008755 tempest-AttachVolumeShelveTestJSON-687008755-project-member] Lock "e2620c94-5629-4094-a92f-d83d9efd6205" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 12.747s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 753.831622] env[59659]: DEBUG nova.network.neutron [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 753.843074] env[59659]: DEBUG oslo_concurrency.lockutils [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Releasing lock "refresh_cache-b8404801-b787-4db2-aa13-320f87ca5ac5" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 753.843593] env[59659]: DEBUG nova.compute.manager [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 753.843847] env[59659]: DEBUG nova.compute.manager [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 753.843985] env[59659]: DEBUG nova.network.neutron [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 753.935805] env[59659]: DEBUG nova.network.neutron [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] Instance cache missing network info. 
{{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 753.943354] env[59659]: DEBUG nova.network.neutron [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 753.954954] env[59659]: INFO nova.compute.manager [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] Took 0.11 seconds to deallocate network for instance. [ 754.067344] env[59659]: INFO nova.scheduler.client.report [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Deleted allocations for instance b8404801-b787-4db2-aa13-320f87ca5ac5 [ 754.090395] env[59659]: DEBUG oslo_concurrency.lockutils [None req-3b83ff7a-707a-4d4d-9e7a-aa78aae545c9 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Lock "b8404801-b787-4db2-aa13-320f87ca5ac5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 13.614s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 754.091163] env[59659]: DEBUG oslo_concurrency.lockutils [None req-b9327b63-17c3-4cf9-a544-0081ceef4d95 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Lock "b8404801-b787-4db2-aa13-320f87ca5ac5" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 11.318s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 754.091163] env[59659]: DEBUG oslo_concurrency.lockutils [None req-b9327b63-17c3-4cf9-a544-0081ceef4d95 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Acquiring lock "b8404801-b787-4db2-aa13-320f87ca5ac5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 754.091163] env[59659]: DEBUG oslo_concurrency.lockutils [None req-b9327b63-17c3-4cf9-a544-0081ceef4d95 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Lock "b8404801-b787-4db2-aa13-320f87ca5ac5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 754.091163] env[59659]: DEBUG oslo_concurrency.lockutils [None req-b9327b63-17c3-4cf9-a544-0081ceef4d95 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Lock "b8404801-b787-4db2-aa13-320f87ca5ac5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 754.098123] env[59659]: INFO nova.compute.manager [None req-b9327b63-17c3-4cf9-a544-0081ceef4d95 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] Terminating instance [ 754.099418] env[59659]: DEBUG 
oslo_concurrency.lockutils [None req-b9327b63-17c3-4cf9-a544-0081ceef4d95 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Acquiring lock "refresh_cache-b8404801-b787-4db2-aa13-320f87ca5ac5" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 754.099476] env[59659]: DEBUG oslo_concurrency.lockutils [None req-b9327b63-17c3-4cf9-a544-0081ceef4d95 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Acquired lock "refresh_cache-b8404801-b787-4db2-aa13-320f87ca5ac5" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 754.099631] env[59659]: DEBUG nova.network.neutron [None req-b9327b63-17c3-4cf9-a544-0081ceef4d95 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 754.206217] env[59659]: DEBUG nova.network.neutron [None req-b9327b63-17c3-4cf9-a544-0081ceef4d95 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 754.343291] env[59659]: DEBUG oslo_service.periodic_task [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59659) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 754.343291] env[59659]: DEBUG nova.compute.manager [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Starting heal instance info cache {{(pid=59659) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 754.343291] env[59659]: DEBUG nova.compute.manager [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Rebuilding the list of instances to heal {{(pid=59659) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 754.370563] env[59659]: DEBUG nova.compute.manager [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] Skipping network cache update for instance because it is Building. {{(pid=59659) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 754.370814] env[59659]: DEBUG nova.compute.manager [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] Skipping network cache update for instance because it is Building. {{(pid=59659) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 754.371134] env[59659]: DEBUG nova.compute.manager [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] [instance: f05a805d-7896-477c-b2ea-437faec88fba] Skipping network cache update for instance because it is Building. {{(pid=59659) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 754.371134] env[59659]: DEBUG nova.compute.manager [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] Skipping network cache update for instance because it is Building. 
{{(pid=59659) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 754.371134] env[59659]: DEBUG nova.compute.manager [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] Skipping network cache update for instance because it is Building. {{(pid=59659) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 754.371279] env[59659]: DEBUG nova.compute.manager [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] [instance: a75a3491-94b0-4754-8e42-7bf49194a022] Skipping network cache update for instance because it is Building. {{(pid=59659) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 754.371321] env[59659]: DEBUG nova.compute.manager [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Didn't find any instances for network info cache update. {{(pid=59659) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 754.644244] env[59659]: DEBUG oslo_concurrency.lockutils [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] Acquiring lock "1195d592-faa2-43d7-af58-12b75abd5ed0" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 754.644244] env[59659]: DEBUG oslo_concurrency.lockutils [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] Lock "1195d592-faa2-43d7-af58-12b75abd5ed0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 754.657990] env[59659]: DEBUG nova.compute.manager [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] Starting instance... 
{{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 754.706018] env[59659]: DEBUG oslo_concurrency.lockutils [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 754.706018] env[59659]: DEBUG oslo_concurrency.lockutils [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 754.706018] env[59659]: INFO nova.compute.claims [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 754.854515] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42dd056a-8e08-468c-9f84-906c0a223dfa {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 754.862287] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c771f7f5-b92c-4cf3-95e5-564d783de539 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 754.900131] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fba9ff5a-34e1-41a3-a47b-95e63b93175d {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 754.908907] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-31b2ec0a-046c-43e3-9a16-a672c001e2ee {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 754.924115] env[59659]: DEBUG nova.compute.provider_tree [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 754.937083] env[59659]: DEBUG nova.scheduler.client.report [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 754.952925] env[59659]: DEBUG oslo_concurrency.lockutils [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 
tempest-ServerActionsTestJSON-335386642-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.249s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 754.953596] env[59659]: DEBUG nova.compute.manager [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] Start building networks asynchronously for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 754.994166] env[59659]: DEBUG nova.compute.utils [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] Using /dev/sd instead of None {{(pid=59659) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 754.994453] env[59659]: DEBUG nova.compute.manager [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] Allocating IP information in the background. {{(pid=59659) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 754.994733] env[59659]: DEBUG nova.network.neutron [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] allocate_for_instance() {{(pid=59659) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 755.005848] env[59659]: DEBUG nova.compute.manager [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] Start building block device mappings for instance. 
{{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 755.027973] env[59659]: DEBUG oslo_service.periodic_task [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59659) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 755.028428] env[59659]: DEBUG oslo_service.periodic_task [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59659) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 755.077512] env[59659]: DEBUG nova.network.neutron [None req-b9327b63-17c3-4cf9-a544-0081ceef4d95 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 755.089665] env[59659]: DEBUG oslo_concurrency.lockutils [None req-b9327b63-17c3-4cf9-a544-0081ceef4d95 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Releasing lock "refresh_cache-b8404801-b787-4db2-aa13-320f87ca5ac5" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 755.091912] env[59659]: DEBUG nova.compute.manager [None req-b9327b63-17c3-4cf9-a544-0081ceef4d95 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] Start destroying the instance on the hypervisor. {{(pid=59659) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 755.091912] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-b9327b63-17c3-4cf9-a544-0081ceef4d95 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] Destroying instance {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 755.091912] env[59659]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-4bb0be34-460d-483b-a23c-759753e9291f {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 755.098752] env[59659]: DEBUG nova.compute.manager [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] Start spawning the instance on the hypervisor. 
{{(pid=59659) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 755.109024] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cdb9aa3e-3d71-4a42-8639-13eae8617f5f {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 755.132729] env[59659]: DEBUG nova.virt.hardware [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-30T19:41:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-30T19:41:37Z,direct_url=,disk_format='vmdk',id=0fa786c9-f55e-46dc-b725-aa456ca9ff53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='3f4ccdf020b5413c9a4233eccb6b55a8',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-30T19:41:37Z,virtual_size=,visibility=), allow threads: False {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 755.133302] env[59659]: DEBUG nova.virt.hardware [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] Flavor limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 755.133639] env[59659]: DEBUG nova.virt.hardware [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] Image limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 755.133943] env[59659]: DEBUG nova.virt.hardware [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] Flavor pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 755.135057] env[59659]: DEBUG nova.virt.hardware [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] Image pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 755.135570] env[59659]: DEBUG nova.virt.hardware [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 755.138018] env[59659]: DEBUG nova.virt.hardware [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 755.138018] env[59659]: DEBUG nova.virt.hardware [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 
tempest-ServerActionsTestJSON-335386642-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 755.138018] env[59659]: DEBUG nova.virt.hardware [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] Got 1 possible topologies {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 755.138018] env[59659]: DEBUG nova.virt.hardware [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 755.138018] env[59659]: DEBUG nova.virt.hardware [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 755.138310] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ded6544-ac12-4b30-a058-37f310fdadd6 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 755.150018] env[59659]: WARNING nova.virt.vmwareapi.vmops [None req-b9327b63-17c3-4cf9-a544-0081ceef4d95 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance b8404801-b787-4db2-aa13-320f87ca5ac5 could not be found. [ 755.150018] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-b9327b63-17c3-4cf9-a544-0081ceef4d95 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] Instance destroyed {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 755.150018] env[59659]: INFO nova.compute.manager [None req-b9327b63-17c3-4cf9-a544-0081ceef4d95 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] Took 0.06 seconds to destroy the instance on the hypervisor. [ 755.150018] env[59659]: DEBUG oslo.service.loopingcall [None req-b9327b63-17c3-4cf9-a544-0081ceef4d95 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=59659) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 755.150328] env[59659]: DEBUG nova.compute.manager [-] [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 755.150519] env[59659]: DEBUG nova.network.neutron [-] [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 755.157179] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-812c7659-9a9f-42e5-b50f-ab13e9778bca {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 755.206292] env[59659]: DEBUG nova.network.neutron [-] [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 755.208313] env[59659]: DEBUG nova.policy [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '88574e74047e494a8e13911db17cffc3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '145cbf0470004e0e8f468d7823433a21', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59659) authorize /opt/stack/nova/nova/policy.py:203}} [ 755.218464] env[59659]: DEBUG nova.network.neutron [-] [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 755.232848] env[59659]: INFO nova.compute.manager [-] [instance: b8404801-b787-4db2-aa13-320f87ca5ac5] Took 0.08 seconds to deallocate network for instance. [ 755.401281] env[59659]: DEBUG oslo_concurrency.lockutils [None req-b9327b63-17c3-4cf9-a544-0081ceef4d95 tempest-ServersTestJSON-1247949540 tempest-ServersTestJSON-1247949540-project-member] Lock "b8404801-b787-4db2-aa13-320f87ca5ac5" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.311s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 755.434601] env[59659]: ERROR nova.compute.manager [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 54e48d56-6f7b-4965-aed4-4439616ed0bf, please check neutron logs for more information. 
[ 755.434601] env[59659]: ERROR nova.compute.manager Traceback (most recent call last): [ 755.434601] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 755.434601] env[59659]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 755.434601] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 755.434601] env[59659]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 755.434601] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 755.434601] env[59659]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 755.434601] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 755.434601] env[59659]: ERROR nova.compute.manager self.force_reraise() [ 755.434601] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 755.434601] env[59659]: ERROR nova.compute.manager raise self.value [ 755.434601] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 755.434601] env[59659]: ERROR nova.compute.manager updated_port = self._update_port( [ 755.434601] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 755.434601] env[59659]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 755.435066] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 755.435066] env[59659]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 755.435066] env[59659]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 54e48d56-6f7b-4965-aed4-4439616ed0bf, please check neutron logs for more information. 
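The traceback above bottoms out at nova/network/neutron.py line 294, where _ensure_no_port_binding_failure raises PortBindingFailed because Neutron handed back a port whose binding could not be completed on the target host. The sketch below is a minimal, self-contained approximation of that kind of check, assuming a port dict shaped like a Neutron port API response in which 'binding:vif_type' is set to 'binding_failed' on failure; it is illustrative, not Nova's verbatim code.

    # Illustrative sketch of the check the traceback points at; not a verbatim
    # copy of nova/network/neutron.py.

    class PortBindingFailed(Exception):
        def __init__(self, port_id):
            super().__init__(
                "Binding failed for port %s, please check neutron logs "
                "for more information." % port_id)
            self.port_id = port_id

    VIF_TYPE_BINDING_FAILED = 'binding_failed'  # Neutron's failed-binding marker

    def ensure_no_port_binding_failure(port):
        """Raise PortBindingFailed if Neutron reports a failed binding.

        `port` is assumed to be a dict from the Neutron API (e.g. the body of
        a port update response) carrying a 'binding:vif_type' field.
        """
        if port.get('binding:vif_type') == VIF_TYPE_BINDING_FAILED:
            raise PortBindingFailed(port_id=port['id'])

    # Example: a port Neutron could not bind, as in the log entry above.
    failed_port = {'id': '54e48d56-6f7b-4965-aed4-4439616ed0bf',
                   'binding:vif_type': 'binding_failed'}
    try:
        ensure_no_port_binding_failure(failed_port)
    except PortBindingFailed as exc:
        print(exc)

In the log this exception is raised inside _update_ports_for_instance, passed through a cleanup context manager, and re-raised so the build fails rather than proceeding with an unusable port.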
[ 755.435066] env[59659]: ERROR nova.compute.manager [ 755.435066] env[59659]: Traceback (most recent call last): [ 755.435066] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 755.435066] env[59659]: listener.cb(fileno) [ 755.435066] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 755.435066] env[59659]: result = function(*args, **kwargs) [ 755.435066] env[59659]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 755.435066] env[59659]: return func(*args, **kwargs) [ 755.435066] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 755.435066] env[59659]: raise e [ 755.435066] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 755.435066] env[59659]: nwinfo = self.network_api.allocate_for_instance( [ 755.435066] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 755.435066] env[59659]: created_port_ids = self._update_ports_for_instance( [ 755.435066] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 755.435066] env[59659]: with excutils.save_and_reraise_exception(): [ 755.435066] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 755.435066] env[59659]: self.force_reraise() [ 755.435066] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 755.435066] env[59659]: raise self.value [ 755.435066] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 755.435066] env[59659]: updated_port = self._update_port( [ 755.435066] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 755.435066] env[59659]: _ensure_no_port_binding_failure(port) [ 755.435066] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 755.435066] env[59659]: raise exception.PortBindingFailed(port_id=port['id']) [ 755.435975] env[59659]: nova.exception.PortBindingFailed: Binding failed for port 54e48d56-6f7b-4965-aed4-4439616ed0bf, please check neutron logs for more information. [ 755.435975] env[59659]: Removing descriptor: 21 [ 755.435975] env[59659]: ERROR nova.compute.manager [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 54e48d56-6f7b-4965-aed4-4439616ed0bf, please check neutron logs for more information. 
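The flattened copy of the same traceback just above also shows the oslo.utils helper that re-raises it: the __exit__, force_reraise and `raise self.value` frames come from excutils.save_and_reraise_exception(), a context manager used to run cleanup when an exception escapes a block and then let the original exception propagate. Below is a small standalone example of that pattern; only the context manager is the real oslo.utils API, the surrounding function and the simulated failure are invented for illustration.

    # The cleanup-then-re-raise pattern visible in the traceback frames above.
    from oslo_utils import excutils

    def update_port_or_cleanup(port_id):
        try:
            raise RuntimeError("simulated port update failure")  # stand-in error
        except Exception:
            with excutils.save_and_reraise_exception():
                # Cleanup that must not swallow the original error; on leaving
                # this block the saved exception is raised again (the
                # force_reraise / `raise self.value` frames in the log).
                print("rolling back work for port %s" % port_id)

    try:
        update_port_or_cleanup("54e48d56-6f7b-4965-aed4-4439616ed0bf")
    except RuntimeError as exc:
        print("original exception re-raised:", exc)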
[ 755.435975] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] Traceback (most recent call last): [ 755.435975] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 755.435975] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] yield resources [ 755.435975] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 755.435975] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] self.driver.spawn(context, instance, image_meta, [ 755.435975] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 755.435975] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] self._vmops.spawn(context, instance, image_meta, injected_files, [ 755.435975] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 755.435975] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] vm_ref = self.build_virtual_machine(instance, [ 755.436323] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 755.436323] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] vif_infos = vmwarevif.get_vif_info(self._session, [ 755.436323] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 755.436323] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] for vif in network_info: [ 755.436323] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 755.436323] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] return self._sync_wrapper(fn, *args, **kwargs) [ 755.436323] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 755.436323] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] self.wait() [ 755.436323] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 755.436323] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] self[:] = self._gt.wait() [ 755.436323] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 755.436323] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] return self._exit_event.wait() [ 755.436323] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 755.436323] env[59659]: ERROR nova.compute.manager 
[instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] result = hub.switch() [ 755.436912] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 755.436912] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] return self.greenlet.switch() [ 755.436912] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 755.436912] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] result = function(*args, **kwargs) [ 755.436912] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 755.436912] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] return func(*args, **kwargs) [ 755.436912] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 755.436912] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] raise e [ 755.436912] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 755.436912] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] nwinfo = self.network_api.allocate_for_instance( [ 755.436912] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 755.436912] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] created_port_ids = self._update_ports_for_instance( [ 755.436912] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 755.437296] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] with excutils.save_and_reraise_exception(): [ 755.437296] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 755.437296] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] self.force_reraise() [ 755.437296] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 755.437296] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] raise self.value [ 755.437296] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 755.437296] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] updated_port = self._update_port( [ 755.437296] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 755.437296] env[59659]: ERROR nova.compute.manager [instance: 
b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] _ensure_no_port_binding_failure(port) [ 755.437296] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 755.437296] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] raise exception.PortBindingFailed(port_id=port['id']) [ 755.437296] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] nova.exception.PortBindingFailed: Binding failed for port 54e48d56-6f7b-4965-aed4-4439616ed0bf, please check neutron logs for more information. [ 755.437296] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] [ 755.437681] env[59659]: INFO nova.compute.manager [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] Terminating instance [ 755.440944] env[59659]: DEBUG oslo_concurrency.lockutils [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] Acquiring lock "refresh_cache-b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 755.441523] env[59659]: DEBUG oslo_concurrency.lockutils [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] Acquired lock "refresh_cache-b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 755.441723] env[59659]: DEBUG nova.network.neutron [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 755.512223] env[59659]: DEBUG nova.network.neutron [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 755.593030] env[59659]: ERROR nova.compute.manager [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 96439e30-308b-4bba-82b1-71b05db60ec7, please check neutron logs for more information. 
[ 755.593030] env[59659]: ERROR nova.compute.manager Traceback (most recent call last): [ 755.593030] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 755.593030] env[59659]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 755.593030] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 755.593030] env[59659]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 755.593030] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 755.593030] env[59659]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 755.593030] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 755.593030] env[59659]: ERROR nova.compute.manager self.force_reraise() [ 755.593030] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 755.593030] env[59659]: ERROR nova.compute.manager raise self.value [ 755.593030] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 755.593030] env[59659]: ERROR nova.compute.manager updated_port = self._update_port( [ 755.593030] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 755.593030] env[59659]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 755.593442] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 755.593442] env[59659]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 755.593442] env[59659]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 96439e30-308b-4bba-82b1-71b05db60ec7, please check neutron logs for more information. 
[ 755.593442] env[59659]: ERROR nova.compute.manager [ 755.593442] env[59659]: Traceback (most recent call last): [ 755.593442] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 755.593442] env[59659]: listener.cb(fileno) [ 755.593442] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 755.593442] env[59659]: result = function(*args, **kwargs) [ 755.593442] env[59659]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 755.593442] env[59659]: return func(*args, **kwargs) [ 755.593442] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 755.593442] env[59659]: raise e [ 755.593442] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 755.593442] env[59659]: nwinfo = self.network_api.allocate_for_instance( [ 755.593442] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 755.593442] env[59659]: created_port_ids = self._update_ports_for_instance( [ 755.593442] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 755.593442] env[59659]: with excutils.save_and_reraise_exception(): [ 755.593442] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 755.593442] env[59659]: self.force_reraise() [ 755.593442] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 755.593442] env[59659]: raise self.value [ 755.593442] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 755.593442] env[59659]: updated_port = self._update_port( [ 755.593442] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 755.593442] env[59659]: _ensure_no_port_binding_failure(port) [ 755.593442] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 755.593442] env[59659]: raise exception.PortBindingFailed(port_id=port['id']) [ 755.594513] env[59659]: nova.exception.PortBindingFailed: Binding failed for port 96439e30-308b-4bba-82b1-71b05db60ec7, please check neutron logs for more information. [ 755.594513] env[59659]: Removing descriptor: 23 [ 755.594513] env[59659]: ERROR nova.compute.manager [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: f05a805d-7896-477c-b2ea-437faec88fba] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 96439e30-308b-4bba-82b1-71b05db60ec7, please check neutron logs for more information. 
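Both port failures are summarised as "Instance failed network setup after 1 attempt(s)", and the tracebacks end with the bare `raise e` in _allocate_network_async: network allocation is attempted a bounded number of times and the final exception is re-raised so the build is aborted and the instance can be cleaned up. The loop below is only a hedged sketch of that shape; the function name, attempt count and delay are illustrative assumptions, not Nova's actual code.

    # Illustrative bounded-retry loop matching the "after N attempt(s)" wording
    # in the log; not a copy of nova/compute/manager.py.
    import logging
    import time

    LOG = logging.getLogger(__name__)

    class PortBindingFailed(Exception):
        pass

    def allocate_network_with_retries(allocate, attempts=1, delay=2.0):
        """Call `allocate()` up to `attempts` times, re-raising the last error."""
        for attempt in range(1, attempts + 1):
            try:
                return allocate()
            except Exception as e:
                LOG.error("Instance failed network setup after %d attempt(s): %s",
                          attempt, e)
                if attempt == attempts:
                    raise  # the caller (the instance build) sees the failure
                time.sleep(delay)

    def failing_allocate():
        raise PortBindingFailed("Binding failed for port "
                                "96439e30-308b-4bba-82b1-71b05db60ec7")

    try:
        allocate_network_with_retries(failing_allocate, attempts=1)
    except PortBindingFailed as exc:
        print("build aborted:", exc)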
[ 755.594513] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] Traceback (most recent call last): [ 755.594513] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 755.594513] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] yield resources [ 755.594513] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 755.594513] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] self.driver.spawn(context, instance, image_meta, [ 755.594513] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 755.594513] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] self._vmops.spawn(context, instance, image_meta, injected_files, [ 755.594513] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 755.594513] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] vm_ref = self.build_virtual_machine(instance, [ 755.594831] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 755.594831] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] vif_infos = vmwarevif.get_vif_info(self._session, [ 755.594831] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 755.594831] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] for vif in network_info: [ 755.594831] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 755.594831] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] return self._sync_wrapper(fn, *args, **kwargs) [ 755.594831] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 755.594831] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] self.wait() [ 755.594831] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 755.594831] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] self[:] = self._gt.wait() [ 755.594831] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 755.594831] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] return self._exit_event.wait() [ 755.594831] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 755.594831] env[59659]: ERROR nova.compute.manager 
[instance: f05a805d-7896-477c-b2ea-437faec88fba] result = hub.switch() [ 755.595370] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 755.595370] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] return self.greenlet.switch() [ 755.595370] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 755.595370] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] result = function(*args, **kwargs) [ 755.595370] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 755.595370] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] return func(*args, **kwargs) [ 755.595370] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 755.595370] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] raise e [ 755.595370] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 755.595370] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] nwinfo = self.network_api.allocate_for_instance( [ 755.595370] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 755.595370] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] created_port_ids = self._update_ports_for_instance( [ 755.595370] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 755.595870] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] with excutils.save_and_reraise_exception(): [ 755.595870] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 755.595870] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] self.force_reraise() [ 755.595870] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 755.595870] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] raise self.value [ 755.595870] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 755.595870] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] updated_port = self._update_port( [ 755.595870] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 755.595870] env[59659]: ERROR nova.compute.manager [instance: 
f05a805d-7896-477c-b2ea-437faec88fba] _ensure_no_port_binding_failure(port) [ 755.595870] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 755.595870] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] raise exception.PortBindingFailed(port_id=port['id']) [ 755.595870] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] nova.exception.PortBindingFailed: Binding failed for port 96439e30-308b-4bba-82b1-71b05db60ec7, please check neutron logs for more information. [ 755.595870] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] [ 755.596298] env[59659]: INFO nova.compute.manager [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: f05a805d-7896-477c-b2ea-437faec88fba] Terminating instance [ 755.596448] env[59659]: DEBUG oslo_concurrency.lockutils [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Acquiring lock "refresh_cache-f05a805d-7896-477c-b2ea-437faec88fba" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 755.597711] env[59659]: DEBUG oslo_concurrency.lockutils [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Acquired lock "refresh_cache-f05a805d-7896-477c-b2ea-437faec88fba" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 755.597711] env[59659]: DEBUG nova.network.neutron [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: f05a805d-7896-477c-b2ea-437faec88fba] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 755.858984] env[59659]: DEBUG nova.network.neutron [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: f05a805d-7896-477c-b2ea-437faec88fba] Instance cache missing network info. 
{{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 756.027510] env[59659]: DEBUG oslo_service.periodic_task [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59659) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 756.027786] env[59659]: DEBUG oslo_service.periodic_task [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59659) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 756.042979] env[59659]: DEBUG oslo_concurrency.lockutils [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 756.042979] env[59659]: DEBUG oslo_concurrency.lockutils [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 756.043397] env[59659]: DEBUG oslo_concurrency.lockutils [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 756.043397] env[59659]: DEBUG nova.compute.resource_tracker [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59659) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 756.045441] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce6cbd5d-cc40-4566-89c6-f65909cdf697 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 756.057726] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28b9590f-bb57-4914-90f9-d4011fe8d926 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 756.075450] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7380e10d-90dd-40d2-bc97-2c22f9509b4c {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 756.085023] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e39c65f4-6e8e-4f31-80a8-dd998e5d1bc1 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 756.091682] env[59659]: DEBUG nova.network.neutron [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 756.122391] env[59659]: DEBUG nova.compute.resource_tracker [None 
req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181438MB free_disk=176GB free_vcpus=48 pci_devices=None {{(pid=59659) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 756.122468] env[59659]: DEBUG oslo_concurrency.lockutils [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 756.122607] env[59659]: DEBUG oslo_concurrency.lockutils [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 756.125537] env[59659]: DEBUG oslo_concurrency.lockutils [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] Releasing lock "refresh_cache-b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 756.125930] env[59659]: DEBUG nova.compute.manager [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] Start destroying the instance on the hypervisor. {{(pid=59659) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 756.126148] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] Destroying instance {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 756.126832] env[59659]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-86634265-0c44-437a-a337-402c22c12e1a {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 756.137027] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14039768-76be-4aae-97f0-39091cf21124 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 756.162731] env[59659]: WARNING nova.virt.vmwareapi.vmops [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30 could not be found. 
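The recurring "Acquiring lock ... by ...", "Lock ... acquired ... :: waited 0.000s" and "Lock ... 'released' ... :: held ...s" lines around compute_resources and the per-instance refresh_cache-<uuid> locks are emitted by oslo_concurrency.lockutils, which times how long callers wait for and hold each named lock. Below is a minimal sketch of the two usual forms, decorator and context manager; the guarded functions are hypothetical, only the lockutils calls are the real oslo.concurrency API.

    # The two common oslo.concurrency usages behind the lock log lines above.
    from oslo_concurrency import lockutils

    @lockutils.synchronized("compute_resources")
    def update_usage():
        """Hypothetical critical section guarded by the 'compute_resources' lock."""
        print("resource tracker bookkeeping runs with the lock held")

    def refresh_instance_cache(instance_uuid):
        # Context-manager form, named like the per-instance locks in the log.
        with lockutils.lock("refresh_cache-%s" % instance_uuid):
            print("network info cache rebuilt for", instance_uuid)

    update_usage()
    refresh_instance_cache("b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30")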
[ 756.162945] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] Instance destroyed {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 756.163144] env[59659]: INFO nova.compute.manager [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] Took 0.04 seconds to destroy the instance on the hypervisor. [ 756.163387] env[59659]: DEBUG oslo.service.loopingcall [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59659) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 756.164739] env[59659]: DEBUG nova.compute.manager [-] [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 756.164739] env[59659]: DEBUG nova.network.neutron [-] [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 756.202361] env[59659]: DEBUG nova.compute.resource_tracker [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Instance ea968312-62ea-4f55-87e9-f91823fc14c2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59659) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 756.202516] env[59659]: DEBUG nova.compute.resource_tracker [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Instance b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59659) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 756.202642] env[59659]: DEBUG nova.compute.resource_tracker [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Instance f05a805d-7896-477c-b2ea-437faec88fba actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59659) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 756.202765] env[59659]: DEBUG nova.compute.resource_tracker [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Instance 78ed17da-e8e8-4872-b1bf-95c4e77de8e6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59659) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 756.202881] env[59659]: DEBUG nova.compute.resource_tracker [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Instance abec9f87-4cde-4b5e-ad2a-fa682842ac7a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59659) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 756.202997] env[59659]: DEBUG nova.compute.resource_tracker [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Instance a75a3491-94b0-4754-8e42-7bf49194a022 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59659) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 756.203129] env[59659]: DEBUG nova.compute.resource_tracker [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Instance 1195d592-faa2-43d7-af58-12b75abd5ed0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59659) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 756.203315] env[59659]: DEBUG nova.compute.resource_tracker [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Total usable vcpus: 48, total allocated vcpus: 7 {{(pid=59659) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 756.203456] env[59659]: DEBUG nova.compute.resource_tracker [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1408MB phys_disk=200GB used_disk=7GB total_vcpus=48 used_vcpus=7 pci_stats=[] {{(pid=59659) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 756.232660] env[59659]: DEBUG nova.network.neutron [-] [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 756.241101] env[59659]: DEBUG nova.network.neutron [-] [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 756.251481] env[59659]: INFO nova.compute.manager [-] [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] Took 0.09 seconds to deallocate network for instance. 
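The "Final resource view" above (used_ram=1408MB, used_disk=7GB, used_vcpus=7) lines up with the seven per-instance allocations just listed, each holding 1 VCPU, 128 MB and 1 GB, plus the 512 MB memory reservation visible in the inventory entries elsewhere in this log. A quick arithmetic check; the only assumption is that used_ram counts the reservation plus the per-instance claims:

    # Consistency check of the resource tracker numbers reported above.
    instances = 7  # ea968312, b4a2d6ae, f05a805d, 78ed17da, abec9f87, a75a3491, 1195d592
    per_instance = {"VCPU": 1, "MEMORY_MB": 128, "DISK_GB": 1}   # allocations listed above
    reserved_memory_mb = 512   # 'reserved': 512 in the MEMORY_MB inventory reported below

    used_vcpus = instances * per_instance["VCPU"]                              # 7
    used_ram_mb = reserved_memory_mb + instances * per_instance["MEMORY_MB"]   # 1408
    used_disk_gb = instances * per_instance["DISK_GB"]                         # 7

    assert (used_vcpus, used_ram_mb, used_disk_gb) == (7, 1408, 7)
    print(f"used_vcpus={used_vcpus} used_ram={used_ram_mb}MB used_disk={used_disk_gb}GB")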
[ 756.253873] env[59659]: DEBUG nova.compute.claims [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] Aborting claim: {{(pid=59659) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 756.254175] env[59659]: DEBUG oslo_concurrency.lockutils [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 756.304418] env[59659]: DEBUG nova.network.neutron [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: f05a805d-7896-477c-b2ea-437faec88fba] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 756.313350] env[59659]: DEBUG oslo_concurrency.lockutils [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Releasing lock "refresh_cache-f05a805d-7896-477c-b2ea-437faec88fba" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 756.313771] env[59659]: DEBUG nova.compute.manager [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: f05a805d-7896-477c-b2ea-437faec88fba] Start destroying the instance on the hypervisor. 
{{(pid=59659) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 756.313908] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: f05a805d-7896-477c-b2ea-437faec88fba] Destroying instance {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 756.314502] env[59659]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-5065275b-6e70-4eca-be9d-c364dd7fc2c9 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 756.318082] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1b8591c-d546-4606-b385-9ecafd9afbe5 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 756.327237] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1dadaac-cd47-40b4-8209-1e7a749057ca {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 756.333444] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9734c860-50b3-4210-ab1e-a9edc02bed20 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 756.379603] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7351ed96-b38f-4c15-81f8-aa6159b8983c {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 756.383050] env[59659]: WARNING nova.virt.vmwareapi.vmops [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: f05a805d-7896-477c-b2ea-437faec88fba] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance f05a805d-7896-477c-b2ea-437faec88fba could not be found. [ 756.383262] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: f05a805d-7896-477c-b2ea-437faec88fba] Instance destroyed {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 756.383439] env[59659]: INFO nova.compute.manager [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: f05a805d-7896-477c-b2ea-437faec88fba] Took 0.07 seconds to destroy the instance on the hypervisor. [ 756.383879] env[59659]: DEBUG oslo.service.loopingcall [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=59659) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 756.384234] env[59659]: DEBUG nova.compute.manager [-] [instance: f05a805d-7896-477c-b2ea-437faec88fba] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 756.384329] env[59659]: DEBUG nova.network.neutron [-] [instance: f05a805d-7896-477c-b2ea-437faec88fba] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 756.391862] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b12b6f6-938d-4ce4-96f4-096aed14d22b {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 756.406176] env[59659]: DEBUG nova.compute.provider_tree [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 756.414916] env[59659]: DEBUG nova.scheduler.client.report [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 756.429329] env[59659]: DEBUG nova.compute.resource_tracker [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59659) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 756.429509] env[59659]: DEBUG oslo_concurrency.lockutils [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.307s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 756.429769] env[59659]: DEBUG oslo_concurrency.lockutils [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.176s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 756.450744] env[59659]: DEBUG nova.network.neutron [-] [instance: f05a805d-7896-477c-b2ea-437faec88fba] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 756.463764] env[59659]: DEBUG nova.network.neutron [-] [instance: f05a805d-7896-477c-b2ea-437faec88fba] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 756.473559] env[59659]: INFO nova.compute.manager [-] [instance: f05a805d-7896-477c-b2ea-437faec88fba] Took 0.09 seconds to deallocate network for instance. 
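The inventory payload repeated throughout this log (VCPU total 48 at allocation_ratio 4.0, MEMORY_MB total 196590 with 512 reserved, DISK_GB total 400 with max_unit 176) is what bounds how much the scheduler can place on this node. The usual placement capacity rule, (total - reserved) * allocation_ratio with max_unit capping any single request, gives the figures below; the formula is the standard placement one, stated here as background rather than something shown in this log:

    # Effective capacity implied by the inventory logged above,
    # using the standard placement rule: (total - reserved) * allocation_ratio.
    inventory = {
        "VCPU":      {"total": 48,     "reserved": 0,   "allocation_ratio": 4.0, "max_unit": 16},
        "MEMORY_MB": {"total": 196590, "reserved": 512, "allocation_ratio": 1.0, "max_unit": 65530},
        "DISK_GB":   {"total": 400,    "reserved": 0,   "allocation_ratio": 1.0, "max_unit": 176},
    }

    for rc, inv in inventory.items():
        capacity = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
        # max_unit limits a single allocation, not the pool as a whole.
        print(f"{rc}: capacity={capacity:.0f}, max single allocation={inv['max_unit']}")
    # -> VCPU: 192, MEMORY_MB: 196078, DISK_GB: 400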
[ 756.476263] env[59659]: DEBUG nova.compute.claims [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: f05a805d-7896-477c-b2ea-437faec88fba] Aborting claim: {{(pid=59659) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 756.476594] env[59659]: DEBUG oslo_concurrency.lockutils [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 756.543840] env[59659]: DEBUG nova.network.neutron [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] Successfully created port: 8ff4da7c-e6f1-47e2-8c4b-f5abfbc23ffa {{(pid=59659) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 756.609614] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-002bb7ab-07e6-4e40-9abe-7b8081a4221c {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 756.620092] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c8bffed1-01e6-4197-9039-1f5e677e4e74 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 756.658083] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f4b276c8-6ef2-41dc-a153-10e284d7232d {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 756.665548] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-890abf59-a547-405e-8493-84e7d3f91e6b {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 756.685264] env[59659]: DEBUG nova.compute.provider_tree [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 756.700885] env[59659]: DEBUG nova.scheduler.client.report [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 756.721541] env[59659]: DEBUG oslo_concurrency.lockutils [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] Lock "compute_resources" 
"released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.292s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 756.724475] env[59659]: ERROR nova.compute.manager [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 54e48d56-6f7b-4965-aed4-4439616ed0bf, please check neutron logs for more information. [ 756.724475] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] Traceback (most recent call last): [ 756.724475] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 756.724475] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] self.driver.spawn(context, instance, image_meta, [ 756.724475] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 756.724475] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] self._vmops.spawn(context, instance, image_meta, injected_files, [ 756.724475] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 756.724475] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] vm_ref = self.build_virtual_machine(instance, [ 756.724475] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 756.724475] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] vif_infos = vmwarevif.get_vif_info(self._session, [ 756.724475] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 756.724814] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] for vif in network_info: [ 756.724814] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 756.724814] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] return self._sync_wrapper(fn, *args, **kwargs) [ 756.724814] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 756.724814] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] self.wait() [ 756.724814] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 756.724814] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] self[:] = self._gt.wait() [ 756.724814] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 756.724814] env[59659]: ERROR 
nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] return self._exit_event.wait() [ 756.724814] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 756.724814] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] result = hub.switch() [ 756.724814] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 756.724814] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] return self.greenlet.switch() [ 756.724814] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 756.725213] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] result = function(*args, **kwargs) [ 756.725213] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 756.725213] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] return func(*args, **kwargs) [ 756.725213] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 756.725213] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] raise e [ 756.725213] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 756.725213] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] nwinfo = self.network_api.allocate_for_instance( [ 756.725213] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 756.725213] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] created_port_ids = self._update_ports_for_instance( [ 756.725213] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 756.725213] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] with excutils.save_and_reraise_exception(): [ 756.725213] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 756.725213] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] self.force_reraise() [ 756.725590] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 756.725590] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] raise self.value [ 756.725590] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 756.725590] env[59659]: ERROR 
nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] updated_port = self._update_port( [ 756.725590] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 756.725590] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] _ensure_no_port_binding_failure(port) [ 756.725590] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 756.725590] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] raise exception.PortBindingFailed(port_id=port['id']) [ 756.725590] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] nova.exception.PortBindingFailed: Binding failed for port 54e48d56-6f7b-4965-aed4-4439616ed0bf, please check neutron logs for more information. [ 756.725590] env[59659]: ERROR nova.compute.manager [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] [ 756.725873] env[59659]: DEBUG nova.compute.utils [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] Binding failed for port 54e48d56-6f7b-4965-aed4-4439616ed0bf, please check neutron logs for more information. {{(pid=59659) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 756.728034] env[59659]: DEBUG oslo_concurrency.lockutils [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.252s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 756.732340] env[59659]: DEBUG nova.compute.manager [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] Build of instance b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30 was re-scheduled: Binding failed for port 54e48d56-6f7b-4965-aed4-4439616ed0bf, please check neutron logs for more information. 
{{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 756.732573] env[59659]: DEBUG nova.compute.manager [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] Unplugging VIFs for instance {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 756.732811] env[59659]: DEBUG oslo_concurrency.lockutils [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] Acquiring lock "refresh_cache-b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 756.732929] env[59659]: DEBUG oslo_concurrency.lockutils [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] Acquired lock "refresh_cache-b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 756.733110] env[59659]: DEBUG nova.network.neutron [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 756.921171] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-092863a2-62b6-40f8-ab32-a4fe10bb784b {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 756.934058] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b067721c-9afe-46a3-8aad-32d70d993290 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 756.939821] env[59659]: DEBUG oslo_concurrency.lockutils [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] Acquiring lock "0ed4be35-b845-48ca-b892-657d96c12728" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 756.940264] env[59659]: DEBUG oslo_concurrency.lockutils [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] Lock "0ed4be35-b845-48ca-b892-657d96c12728" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 756.973480] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d73fa99d-e9be-4ced-b082-ddf0768a11ab {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 756.977256] env[59659]: DEBUG nova.compute.manager [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 
tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] [instance: 0ed4be35-b845-48ca-b892-657d96c12728] Starting instance... {{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 756.989343] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-951cc486-a7ed-4f34-b071-b197b6d5c19f {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 757.007398] env[59659]: DEBUG nova.compute.provider_tree [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 757.020475] env[59659]: DEBUG nova.scheduler.client.report [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 757.023932] env[59659]: DEBUG nova.network.neutron [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 757.031939] env[59659]: DEBUG oslo_concurrency.lockutils [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 757.041487] env[59659]: DEBUG oslo_concurrency.lockutils [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.313s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 757.042338] env[59659]: ERROR nova.compute.manager [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: f05a805d-7896-477c-b2ea-437faec88fba] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 96439e30-308b-4bba-82b1-71b05db60ec7, please check neutron logs for more information. 
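Both build failures in this log (b4a2d6ae above, and f05a805d whose traceback follows) bottom out in nova/network/neutron.py, _ensure_no_port_binding_failure, raising PortBindingFailed(port_id=port['id']). The tracebacks only show the raise; the sketch below illustrates the kind of guard that frame performs, where the binding:vif_type comparison is an assumption about how Neutron reports a failed binding, not something quoted from this log:

    # Illustrative guard matching the bottom frame of the tracebacks above and below.
    class PortBindingFailed(Exception):
        """Stand-in for nova.exception.PortBindingFailed."""
        def __init__(self, port_id):
            super().__init__(
                f"Binding failed for port {port_id}, "
                "please check neutron logs for more information.")

    def ensure_no_port_binding_failure(port):
        # Neutron marks a port whose binding could not be completed with a
        # failed vif_type; Nova turns that into PortBindingFailed (assumed detail).
        if port.get("binding:vif_type") == "binding_failed":
            raise PortBindingFailed(port_id=port["id"])

    try:
        ensure_no_port_binding_failure(
            {"id": "96439e30-308b-4bba-82b1-71b05db60ec7",
             "binding:vif_type": "binding_failed"})
    except PortBindingFailed as exc:
        print(f"PortBindingFailed: {exc}")   # same message as the ERROR lines here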
[ 757.042338] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] Traceback (most recent call last): [ 757.042338] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 757.042338] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] self.driver.spawn(context, instance, image_meta, [ 757.042338] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 757.042338] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] self._vmops.spawn(context, instance, image_meta, injected_files, [ 757.042338] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 757.042338] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] vm_ref = self.build_virtual_machine(instance, [ 757.042338] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 757.042338] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] vif_infos = vmwarevif.get_vif_info(self._session, [ 757.042338] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 757.042843] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] for vif in network_info: [ 757.042843] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 757.042843] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] return self._sync_wrapper(fn, *args, **kwargs) [ 757.042843] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 757.042843] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] self.wait() [ 757.042843] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 757.042843] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] self[:] = self._gt.wait() [ 757.042843] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 757.042843] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] return self._exit_event.wait() [ 757.042843] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 757.042843] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] result = hub.switch() [ 757.042843] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 757.042843] env[59659]: ERROR 
nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] return self.greenlet.switch() [ 757.042843] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 757.044093] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] result = function(*args, **kwargs) [ 757.044093] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 757.044093] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] return func(*args, **kwargs) [ 757.044093] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 757.044093] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] raise e [ 757.044093] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 757.044093] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] nwinfo = self.network_api.allocate_for_instance( [ 757.044093] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 757.044093] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] created_port_ids = self._update_ports_for_instance( [ 757.044093] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 757.044093] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] with excutils.save_and_reraise_exception(): [ 757.044093] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 757.044093] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] self.force_reraise() [ 757.044449] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 757.044449] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] raise self.value [ 757.044449] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 757.044449] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] updated_port = self._update_port( [ 757.044449] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 757.044449] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] _ensure_no_port_binding_failure(port) [ 757.044449] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 757.044449] env[59659]: ERROR 
nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] raise exception.PortBindingFailed(port_id=port['id']) [ 757.044449] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] nova.exception.PortBindingFailed: Binding failed for port 96439e30-308b-4bba-82b1-71b05db60ec7, please check neutron logs for more information. [ 757.044449] env[59659]: ERROR nova.compute.manager [instance: f05a805d-7896-477c-b2ea-437faec88fba] [ 757.044449] env[59659]: DEBUG nova.compute.utils [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: f05a805d-7896-477c-b2ea-437faec88fba] Binding failed for port 96439e30-308b-4bba-82b1-71b05db60ec7, please check neutron logs for more information. {{(pid=59659) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 757.044739] env[59659]: DEBUG oslo_concurrency.lockutils [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.012s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 757.045763] env[59659]: INFO nova.compute.claims [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] [instance: 0ed4be35-b845-48ca-b892-657d96c12728] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 757.048553] env[59659]: DEBUG nova.compute.manager [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: f05a805d-7896-477c-b2ea-437faec88fba] Build of instance f05a805d-7896-477c-b2ea-437faec88fba was re-scheduled: Binding failed for port 96439e30-308b-4bba-82b1-71b05db60ec7, please check neutron logs for more information. 
{{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 757.048990] env[59659]: DEBUG nova.compute.manager [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: f05a805d-7896-477c-b2ea-437faec88fba] Unplugging VIFs for instance {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 757.049220] env[59659]: DEBUG oslo_concurrency.lockutils [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Acquiring lock "refresh_cache-f05a805d-7896-477c-b2ea-437faec88fba" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 757.049362] env[59659]: DEBUG oslo_concurrency.lockutils [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Acquired lock "refresh_cache-f05a805d-7896-477c-b2ea-437faec88fba" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 757.049512] env[59659]: DEBUG nova.network.neutron [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: f05a805d-7896-477c-b2ea-437faec88fba] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 757.836129] env[59659]: DEBUG nova.network.neutron [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: f05a805d-7896-477c-b2ea-437faec88fba] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 757.839194] env[59659]: DEBUG oslo_service.periodic_task [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59659) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 757.843726] env[59659]: DEBUG oslo_service.periodic_task [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59659) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 757.844060] env[59659]: DEBUG oslo_concurrency.lockutils [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] Acquiring lock "938a2016-8eaa-446a-b69c-3af59448d944" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 757.844267] env[59659]: DEBUG oslo_concurrency.lockutils [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] Lock "938a2016-8eaa-446a-b69c-3af59448d944" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 757.845981] env[59659]: DEBUG oslo_service.periodic_task [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59659) 
run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 757.846847] env[59659]: DEBUG oslo_service.periodic_task [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59659) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 757.847015] env[59659]: DEBUG nova.compute.manager [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59659) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 757.859407] env[59659]: DEBUG nova.compute.manager [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] [instance: 938a2016-8eaa-446a-b69c-3af59448d944] Starting instance... {{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 757.920140] env[59659]: DEBUG oslo_concurrency.lockutils [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 758.013452] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09d1f501-9d66-49cc-a677-ce8bcfa5eb74 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 758.022633] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0bab199b-b092-43cd-a9c8-040f6083cc8b {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 758.055512] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b56a82d-8be5-4510-9f7f-d88573e4994c {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 758.061702] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b0fb94c-2507-4cef-9ee5-a5eee4807780 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 758.077193] env[59659]: DEBUG nova.compute.provider_tree [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 758.086904] env[59659]: DEBUG nova.scheduler.client.report [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) 
set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 758.105140] env[59659]: DEBUG oslo_concurrency.lockutils [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.061s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 758.105660] env[59659]: DEBUG nova.compute.manager [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] [instance: 0ed4be35-b845-48ca-b892-657d96c12728] Start building networks asynchronously for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 758.112206] env[59659]: DEBUG oslo_concurrency.lockutils [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.189s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 758.112206] env[59659]: INFO nova.compute.claims [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] [instance: 938a2016-8eaa-446a-b69c-3af59448d944] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 758.141936] env[59659]: DEBUG nova.compute.utils [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] Using /dev/sd instead of None {{(pid=59659) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 758.147019] env[59659]: DEBUG nova.compute.manager [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] [instance: 0ed4be35-b845-48ca-b892-657d96c12728] Allocating IP information in the background. {{(pid=59659) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 758.147019] env[59659]: DEBUG nova.network.neutron [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] [instance: 0ed4be35-b845-48ca-b892-657d96c12728] allocate_for_instance() {{(pid=59659) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 758.153805] env[59659]: DEBUG nova.compute.manager [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] [instance: 0ed4be35-b845-48ca-b892-657d96c12728] Start building block device mappings for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 758.228775] env[59659]: DEBUG nova.compute.manager [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] [instance: 0ed4be35-b845-48ca-b892-657d96c12728] Start spawning the instance on the hypervisor. 
{{(pid=59659) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 758.251203] env[59659]: DEBUG nova.virt.hardware [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-30T19:41:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-30T19:41:37Z,direct_url=,disk_format='vmdk',id=0fa786c9-f55e-46dc-b725-aa456ca9ff53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='3f4ccdf020b5413c9a4233eccb6b55a8',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-30T19:41:37Z,virtual_size=,visibility=), allow threads: False {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 758.251539] env[59659]: DEBUG nova.virt.hardware [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] Flavor limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 758.251718] env[59659]: DEBUG nova.virt.hardware [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] Image limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 758.251899] env[59659]: DEBUG nova.virt.hardware [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] Flavor pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 758.252054] env[59659]: DEBUG nova.virt.hardware [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] Image pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 758.252200] env[59659]: DEBUG nova.virt.hardware [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 758.252402] env[59659]: DEBUG nova.virt.hardware [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 758.252550] env[59659]: DEBUG nova.virt.hardware [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59659) 
_get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 758.252706] env[59659]: DEBUG nova.virt.hardware [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] Got 1 possible topologies {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 758.252858] env[59659]: DEBUG nova.virt.hardware [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 758.253036] env[59659]: DEBUG nova.virt.hardware [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 758.253883] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1b50992-f9cc-469f-958b-b750fe00cd84 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 758.263177] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1fdbc972-a102-430d-aae5-b2be4ed726da {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 758.294029] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-63d83a46-5146-49df-a7a9-4fd9484fcded {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 758.301939] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9a15d00-4667-4fa3-95e2-ac40fe1e1f31 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 758.332063] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c9c0617-92fa-4f2a-8df5-6d15bcb5ba55 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 758.340486] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd7a762e-79a4-432a-a2f3-debec3472b73 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 758.357114] env[59659]: DEBUG nova.compute.provider_tree [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 758.368141] env[59659]: DEBUG nova.scheduler.client.report [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 
196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 758.385329] env[59659]: DEBUG oslo_concurrency.lockutils [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.276s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 758.385329] env[59659]: DEBUG nova.compute.manager [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] [instance: 938a2016-8eaa-446a-b69c-3af59448d944] Start building networks asynchronously for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 758.417446] env[59659]: DEBUG nova.network.neutron [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 758.422178] env[59659]: DEBUG nova.compute.utils [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] Using /dev/sd instead of None {{(pid=59659) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 758.423593] env[59659]: DEBUG nova.compute.manager [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] [instance: 938a2016-8eaa-446a-b69c-3af59448d944] Allocating IP information in the background. {{(pid=59659) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 758.423756] env[59659]: DEBUG nova.network.neutron [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] [instance: 938a2016-8eaa-446a-b69c-3af59448d944] allocate_for_instance() {{(pid=59659) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 758.429678] env[59659]: DEBUG oslo_concurrency.lockutils [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] Releasing lock "refresh_cache-b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 758.429850] env[59659]: DEBUG nova.compute.manager [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 758.430030] env[59659]: DEBUG nova.compute.manager [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 758.430480] env[59659]: DEBUG nova.network.neutron [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 758.432260] env[59659]: DEBUG nova.compute.manager [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] [instance: 938a2016-8eaa-446a-b69c-3af59448d944] Start building block device mappings for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 758.456246] env[59659]: DEBUG nova.network.neutron [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 758.464418] env[59659]: DEBUG nova.network.neutron [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 758.483652] env[59659]: INFO nova.compute.manager [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] [instance: b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30] Took 0.05 seconds to deallocate network for instance. [ 758.514123] env[59659]: DEBUG nova.compute.manager [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] [instance: 938a2016-8eaa-446a-b69c-3af59448d944] Start spawning the instance on the hypervisor. 
{{(pid=59659) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 758.536263] env[59659]: DEBUG nova.virt.hardware [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-30T19:41:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-30T19:41:37Z,direct_url=,disk_format='vmdk',id=0fa786c9-f55e-46dc-b725-aa456ca9ff53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='3f4ccdf020b5413c9a4233eccb6b55a8',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-30T19:41:37Z,virtual_size=,visibility=), allow threads: False {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 758.536503] env[59659]: DEBUG nova.virt.hardware [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] Flavor limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 758.536781] env[59659]: DEBUG nova.virt.hardware [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] Image limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 758.536849] env[59659]: DEBUG nova.virt.hardware [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] Flavor pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 758.536957] env[59659]: DEBUG nova.virt.hardware [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] Image pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 758.537108] env[59659]: DEBUG nova.virt.hardware [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 758.537342] env[59659]: DEBUG nova.virt.hardware [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 758.537609] env[59659]: DEBUG nova.virt.hardware [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 758.537940] 
env[59659]: DEBUG nova.virt.hardware [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] Got 1 possible topologies {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 758.538139] env[59659]: DEBUG nova.virt.hardware [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 758.538309] env[59659]: DEBUG nova.virt.hardware [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 758.540291] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f40f2e6-79a6-4dbd-acc9-df109dcb1869 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 758.551526] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ff126ce-4cdd-43ee-be3c-2c14b41ad39b {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 758.589099] env[59659]: INFO nova.scheduler.client.report [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] Deleted allocations for instance b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30 [ 758.612921] env[59659]: DEBUG nova.policy [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '94b356ff05594435a3bde51ddccf333b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f8b200f828624b43b290c7894018dec6', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59659) authorize /opt/stack/nova/nova/policy.py:203}} [ 758.618313] env[59659]: DEBUG oslo_concurrency.lockutils [None req-6e21e153-a5d9-418f-91b1-946ea426e46d tempest-FloatingIPsAssociationTestJSON-246739974 tempest-FloatingIPsAssociationTestJSON-246739974-project-member] Lock "b4a2d6ae-f580-43a6-b1b7-9a4727c3ac30" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 16.320s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 758.808683] env[59659]: DEBUG nova.policy [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '56b440aff52c4608a2db2ddbe818847b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '392af2151280472aa22ca8faa2f253ce', 
'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59659) authorize /opt/stack/nova/nova/policy.py:203}} [ 758.810368] env[59659]: DEBUG nova.network.neutron [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: f05a805d-7896-477c-b2ea-437faec88fba] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 758.824299] env[59659]: DEBUG oslo_concurrency.lockutils [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Releasing lock "refresh_cache-f05a805d-7896-477c-b2ea-437faec88fba" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 758.824299] env[59659]: DEBUG nova.compute.manager [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 758.824299] env[59659]: DEBUG nova.compute.manager [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: f05a805d-7896-477c-b2ea-437faec88fba] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 758.824299] env[59659]: DEBUG nova.network.neutron [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: f05a805d-7896-477c-b2ea-437faec88fba] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 758.868704] env[59659]: DEBUG oslo_concurrency.lockutils [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Acquiring lock "72a92098-562e-47bf-8dde-8b62b182d7bb" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 758.868939] env[59659]: DEBUG oslo_concurrency.lockutils [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Lock "72a92098-562e-47bf-8dde-8b62b182d7bb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 758.883092] env[59659]: DEBUG nova.compute.manager [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] Starting instance... 
{{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 758.919802] env[59659]: DEBUG nova.network.neutron [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: f05a805d-7896-477c-b2ea-437faec88fba] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 758.928818] env[59659]: DEBUG nova.network.neutron [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: f05a805d-7896-477c-b2ea-437faec88fba] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 758.938721] env[59659]: INFO nova.compute.manager [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] [instance: f05a805d-7896-477c-b2ea-437faec88fba] Took 0.11 seconds to deallocate network for instance. [ 758.951355] env[59659]: DEBUG oslo_concurrency.lockutils [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 758.951591] env[59659]: DEBUG oslo_concurrency.lockutils [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 758.953153] env[59659]: INFO nova.compute.claims [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 759.044974] env[59659]: INFO nova.scheduler.client.report [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Deleted allocations for instance f05a805d-7896-477c-b2ea-437faec88fba [ 759.077541] env[59659]: DEBUG oslo_concurrency.lockutils [None req-20a0ea14-6eb7-4fc0-84ed-491fc82e1ed9 tempest-ImagesTestJSON-1854810537 tempest-ImagesTestJSON-1854810537-project-member] Lock "f05a805d-7896-477c-b2ea-437faec88fba" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 16.755s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 759.159381] env[59659]: ERROR nova.compute.manager [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 8ff4da7c-e6f1-47e2-8c4b-f5abfbc23ffa, please check neutron logs for more information. 
[ 759.159381] env[59659]: ERROR nova.compute.manager Traceback (most recent call last): [ 759.159381] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 759.159381] env[59659]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 759.159381] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 759.159381] env[59659]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 759.159381] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 759.159381] env[59659]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 759.159381] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 759.159381] env[59659]: ERROR nova.compute.manager self.force_reraise() [ 759.159381] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 759.159381] env[59659]: ERROR nova.compute.manager raise self.value [ 759.159381] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 759.159381] env[59659]: ERROR nova.compute.manager updated_port = self._update_port( [ 759.159381] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 759.159381] env[59659]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 759.163193] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 759.163193] env[59659]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 759.163193] env[59659]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 8ff4da7c-e6f1-47e2-8c4b-f5abfbc23ffa, please check neutron logs for more information. 
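Editor's note: the traceback above bottoms out in _ensure_no_port_binding_failure (nova/network/neutron.py:294 in this deployment), which turns a Neutron port whose binding failed into the PortBindingFailed error logged here. The following is a minimal standalone sketch of that check, not Nova's actual implementation: the PortBindingFailed class is a stand-in for nova.exception.PortBindingFailed, and keying off the port's 'binding:vif_type' field is an assumption about how the failed binding is detected rather than something this log confirms.

# Minimal, self-contained sketch of the check seen at
# nova/network/neutron.py:294 in the traceback above. The exception class
# is a stand-in for nova.exception.PortBindingFailed; the 'binding:vif_type'
# key is an assumption, not something shown in this log.

class PortBindingFailed(Exception):
    def __init__(self, port_id):
        super().__init__(
            f"Binding failed for port {port_id}, "
            "please check neutron logs for more information.")


def ensure_no_port_binding_failure(port: dict) -> None:
    # Neutron marks a port whose binding could not be completed with a
    # failed vif_type; treat that as a hard failure for the boot request.
    if port.get('binding:vif_type') == 'binding_failed':
        raise PortBindingFailed(port_id=port['id'])


if __name__ == '__main__':
    failed_port = {'id': '8ff4da7c-e6f1-47e2-8c4b-f5abfbc23ffa',
                   'binding:vif_type': 'binding_failed'}
    try:
        ensure_no_port_binding_failure(failed_port)
    except PortBindingFailed as exc:
        print(exc)  # mirrors the message logged above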
[ 759.163193] env[59659]: ERROR nova.compute.manager [ 759.163193] env[59659]: Traceback (most recent call last): [ 759.163193] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 759.163193] env[59659]: listener.cb(fileno) [ 759.163193] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 759.163193] env[59659]: result = function(*args, **kwargs) [ 759.163193] env[59659]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 759.163193] env[59659]: return func(*args, **kwargs) [ 759.163193] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 759.163193] env[59659]: raise e [ 759.163193] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 759.163193] env[59659]: nwinfo = self.network_api.allocate_for_instance( [ 759.163193] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 759.163193] env[59659]: created_port_ids = self._update_ports_for_instance( [ 759.163193] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 759.163193] env[59659]: with excutils.save_and_reraise_exception(): [ 759.163193] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 759.163193] env[59659]: self.force_reraise() [ 759.163193] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 759.163193] env[59659]: raise self.value [ 759.163193] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 759.163193] env[59659]: updated_port = self._update_port( [ 759.163193] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 759.163193] env[59659]: _ensure_no_port_binding_failure(port) [ 759.163193] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 759.163193] env[59659]: raise exception.PortBindingFailed(port_id=port['id']) [ 759.164409] env[59659]: nova.exception.PortBindingFailed: Binding failed for port 8ff4da7c-e6f1-47e2-8c4b-f5abfbc23ffa, please check neutron logs for more information. [ 759.164409] env[59659]: Removing descriptor: 12 [ 759.164409] env[59659]: ERROR nova.compute.manager [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 8ff4da7c-e6f1-47e2-8c4b-f5abfbc23ffa, please check neutron logs for more information. 
[ 759.164409] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] Traceback (most recent call last): [ 759.164409] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 759.164409] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] yield resources [ 759.164409] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 759.164409] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] self.driver.spawn(context, instance, image_meta, [ 759.164409] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 759.164409] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] self._vmops.spawn(context, instance, image_meta, injected_files, [ 759.164409] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 759.164409] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] vm_ref = self.build_virtual_machine(instance, [ 759.164780] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 759.164780] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] vif_infos = vmwarevif.get_vif_info(self._session, [ 759.164780] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 759.164780] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] for vif in network_info: [ 759.164780] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 759.164780] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] return self._sync_wrapper(fn, *args, **kwargs) [ 759.164780] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 759.164780] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] self.wait() [ 759.164780] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 759.164780] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] self[:] = self._gt.wait() [ 759.164780] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 759.164780] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] return self._exit_event.wait() [ 759.164780] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 759.164780] env[59659]: ERROR nova.compute.manager 
[instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] result = hub.switch() [ 759.165189] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 759.165189] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] return self.greenlet.switch() [ 759.165189] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 759.165189] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] result = function(*args, **kwargs) [ 759.165189] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 759.165189] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] return func(*args, **kwargs) [ 759.165189] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 759.165189] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] raise e [ 759.165189] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 759.165189] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] nwinfo = self.network_api.allocate_for_instance( [ 759.165189] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 759.165189] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] created_port_ids = self._update_ports_for_instance( [ 759.165189] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 759.165515] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] with excutils.save_and_reraise_exception(): [ 759.165515] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 759.165515] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] self.force_reraise() [ 759.165515] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 759.165515] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] raise self.value [ 759.165515] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 759.165515] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] updated_port = self._update_port( [ 759.165515] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 759.165515] env[59659]: ERROR nova.compute.manager [instance: 
1195d592-faa2-43d7-af58-12b75abd5ed0] _ensure_no_port_binding_failure(port) [ 759.165515] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 759.165515] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] raise exception.PortBindingFailed(port_id=port['id']) [ 759.165515] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] nova.exception.PortBindingFailed: Binding failed for port 8ff4da7c-e6f1-47e2-8c4b-f5abfbc23ffa, please check neutron logs for more information. [ 759.165515] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] [ 759.165837] env[59659]: INFO nova.compute.manager [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] Terminating instance [ 759.168040] env[59659]: DEBUG oslo_concurrency.lockutils [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] Acquiring lock "refresh_cache-1195d592-faa2-43d7-af58-12b75abd5ed0" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 759.168040] env[59659]: DEBUG oslo_concurrency.lockutils [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] Acquired lock "refresh_cache-1195d592-faa2-43d7-af58-12b75abd5ed0" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 759.168040] env[59659]: DEBUG nova.network.neutron [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 759.182894] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-51bcfdba-a480-4b61-9e92-2cd58b6c280e {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 759.193588] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7833f74-ed63-42c1-9425-1328e0982491 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 759.224296] env[59659]: DEBUG nova.network.neutron [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] Instance cache missing network info. 
{{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 759.226716] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f4aa20b5-f9bd-40f1-82d1-28411ce83d78 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 759.235775] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e5cb4a1-2c85-4e91-8e19-2a82e7b10c36 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 759.251098] env[59659]: DEBUG nova.compute.provider_tree [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 759.264736] env[59659]: DEBUG nova.scheduler.client.report [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 759.284952] env[59659]: DEBUG oslo_concurrency.lockutils [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.333s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 759.285574] env[59659]: DEBUG nova.compute.manager [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] Start building networks asynchronously for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 759.324474] env[59659]: DEBUG nova.compute.utils [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Using /dev/sd instead of None {{(pid=59659) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 759.326080] env[59659]: DEBUG nova.compute.manager [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] Allocating IP information in the background. 
{{(pid=59659) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 759.326627] env[59659]: DEBUG nova.network.neutron [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] allocate_for_instance() {{(pid=59659) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 759.335031] env[59659]: DEBUG nova.compute.manager [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] Start building block device mappings for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 759.412223] env[59659]: DEBUG nova.compute.manager [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] Start spawning the instance on the hypervisor. {{(pid=59659) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 759.436168] env[59659]: DEBUG nova.virt.hardware [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-30T19:41:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-30T19:41:37Z,direct_url=,disk_format='vmdk',id=0fa786c9-f55e-46dc-b725-aa456ca9ff53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='3f4ccdf020b5413c9a4233eccb6b55a8',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-30T19:41:37Z,virtual_size=,visibility=), allow threads: False {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 759.436320] env[59659]: DEBUG nova.virt.hardware [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Flavor limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 759.436377] env[59659]: DEBUG nova.virt.hardware [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Image limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 759.436544] env[59659]: DEBUG nova.virt.hardware [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Flavor pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 759.436713] env[59659]: DEBUG nova.virt.hardware [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Image pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 759.437088] 
env[59659]: DEBUG nova.virt.hardware [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 759.437266] env[59659]: DEBUG nova.virt.hardware [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 759.437500] env[59659]: DEBUG nova.virt.hardware [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 759.437732] env[59659]: DEBUG nova.virt.hardware [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Got 1 possible topologies {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 759.437907] env[59659]: DEBUG nova.virt.hardware [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 759.438204] env[59659]: DEBUG nova.virt.hardware [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 759.439074] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-04a3fecc-049a-4e3b-956c-cf28cb343e29 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 759.448919] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4bed68ad-35e3-46a3-a3d1-bc3c81a6bee2 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 759.639417] env[59659]: ERROR nova.compute.manager [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port f3f7985f-2ffb-489b-bf50-5a30759b413b, please check neutron logs for more information. 
[ 759.639417] env[59659]: ERROR nova.compute.manager Traceback (most recent call last): [ 759.639417] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 759.639417] env[59659]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 759.639417] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 759.639417] env[59659]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 759.639417] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 759.639417] env[59659]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 759.639417] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 759.639417] env[59659]: ERROR nova.compute.manager self.force_reraise() [ 759.639417] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 759.639417] env[59659]: ERROR nova.compute.manager raise self.value [ 759.639417] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 759.639417] env[59659]: ERROR nova.compute.manager updated_port = self._update_port( [ 759.639417] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 759.639417] env[59659]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 759.642126] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 759.642126] env[59659]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 759.642126] env[59659]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port f3f7985f-2ffb-489b-bf50-5a30759b413b, please check neutron logs for more information. 
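Editor's note: the excutils.py frames in these tracebacks (__exit__ at line 227, force_reraise at line 200) come from oslo_utils' save_and_reraise_exception context manager, which _update_ports_for_instance uses to run cleanup and then re-raise the original exception with its traceback intact. Below is a hedged sketch of that usage pattern under stated assumptions: _roll_back_ports is a hypothetical cleanup helper used purely for illustration, and the sketch assumes oslo.utils is installed; it is not the Nova code itself.

# Sketch of the oslo_utils excutils.save_and_reraise_exception pattern that
# produces the __exit__/force_reraise frames in the tracebacks above.
# _roll_back_ports() is a hypothetical helper, not a function from this log.
from oslo_utils import excutils


def _roll_back_ports(created_ids):
    print(f"cleaning up {len(created_ids)} port(s) before re-raising")


def update_ports(ports):
    created_ids = []
    for port in ports:
        try:
            if port.get('binding:vif_type') == 'binding_failed':
                raise RuntimeError(f"Binding failed for port {port['id']}")
            created_ids.append(port['id'])
        except Exception:
            # Run cleanup, then let the context manager re-raise the original
            # exception on exit -- the force_reraise() frame seen above.
            with excutils.save_and_reraise_exception():
                _roll_back_ports(created_ids)
    return created_ids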
[ 759.642126] env[59659]: ERROR nova.compute.manager [ 759.642126] env[59659]: Traceback (most recent call last): [ 759.642126] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 759.642126] env[59659]: listener.cb(fileno) [ 759.642126] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 759.642126] env[59659]: result = function(*args, **kwargs) [ 759.642126] env[59659]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 759.642126] env[59659]: return func(*args, **kwargs) [ 759.642126] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 759.642126] env[59659]: raise e [ 759.642126] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 759.642126] env[59659]: nwinfo = self.network_api.allocate_for_instance( [ 759.642126] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 759.642126] env[59659]: created_port_ids = self._update_ports_for_instance( [ 759.642126] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 759.642126] env[59659]: with excutils.save_and_reraise_exception(): [ 759.642126] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 759.642126] env[59659]: self.force_reraise() [ 759.642126] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 759.642126] env[59659]: raise self.value [ 759.642126] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 759.642126] env[59659]: updated_port = self._update_port( [ 759.642126] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 759.642126] env[59659]: _ensure_no_port_binding_failure(port) [ 759.642126] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 759.642126] env[59659]: raise exception.PortBindingFailed(port_id=port['id']) [ 759.643159] env[59659]: nova.exception.PortBindingFailed: Binding failed for port f3f7985f-2ffb-489b-bf50-5a30759b413b, please check neutron logs for more information. [ 759.643159] env[59659]: Removing descriptor: 17 [ 759.643159] env[59659]: ERROR nova.compute.manager [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port f3f7985f-2ffb-489b-bf50-5a30759b413b, please check neutron logs for more information. 
[ 759.643159] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] Traceback (most recent call last): [ 759.643159] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 759.643159] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] yield resources [ 759.643159] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 759.643159] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] self.driver.spawn(context, instance, image_meta, [ 759.643159] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 759.643159] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 759.643159] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 759.643159] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] vm_ref = self.build_virtual_machine(instance, [ 759.643465] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 759.643465] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] vif_infos = vmwarevif.get_vif_info(self._session, [ 759.643465] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 759.643465] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] for vif in network_info: [ 759.643465] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 759.643465] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] return self._sync_wrapper(fn, *args, **kwargs) [ 759.643465] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 759.643465] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] self.wait() [ 759.643465] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 759.643465] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] self[:] = self._gt.wait() [ 759.643465] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 759.643465] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] return self._exit_event.wait() [ 759.643465] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 759.643465] env[59659]: ERROR nova.compute.manager 
[instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] result = hub.switch() [ 759.643808] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 759.643808] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] return self.greenlet.switch() [ 759.643808] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 759.643808] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] result = function(*args, **kwargs) [ 759.643808] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 759.643808] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] return func(*args, **kwargs) [ 759.643808] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 759.643808] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] raise e [ 759.643808] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 759.643808] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] nwinfo = self.network_api.allocate_for_instance( [ 759.643808] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 759.643808] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] created_port_ids = self._update_ports_for_instance( [ 759.643808] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 759.644141] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] with excutils.save_and_reraise_exception(): [ 759.644141] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 759.644141] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] self.force_reraise() [ 759.644141] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 759.644141] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] raise self.value [ 759.644141] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 759.644141] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] updated_port = self._update_port( [ 759.644141] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 759.644141] env[59659]: ERROR nova.compute.manager [instance: 
abec9f87-4cde-4b5e-ad2a-fa682842ac7a] _ensure_no_port_binding_failure(port) [ 759.644141] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 759.644141] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] raise exception.PortBindingFailed(port_id=port['id']) [ 759.644141] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] nova.exception.PortBindingFailed: Binding failed for port f3f7985f-2ffb-489b-bf50-5a30759b413b, please check neutron logs for more information. [ 759.644141] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] [ 759.644510] env[59659]: INFO nova.compute.manager [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] Terminating instance [ 759.648740] env[59659]: DEBUG oslo_concurrency.lockutils [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Acquiring lock "refresh_cache-abec9f87-4cde-4b5e-ad2a-fa682842ac7a" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 759.648740] env[59659]: DEBUG oslo_concurrency.lockutils [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Acquired lock "refresh_cache-abec9f87-4cde-4b5e-ad2a-fa682842ac7a" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 759.648740] env[59659]: DEBUG nova.network.neutron [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 759.690151] env[59659]: DEBUG nova.network.neutron [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 759.699932] env[59659]: DEBUG oslo_concurrency.lockutils [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] Releasing lock "refresh_cache-1195d592-faa2-43d7-af58-12b75abd5ed0" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 759.700471] env[59659]: DEBUG nova.compute.manager [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] Start destroying the instance on the hypervisor. 
{{(pid=59659) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 759.700539] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] Destroying instance {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 759.701152] env[59659]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-ae16218a-4fc9-42c0-a01e-19995a818726 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 759.706468] env[59659]: DEBUG nova.policy [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b8d25c6dcda1421b82c920c9580bf020', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f858ee4b23fb49c399f103c4a8bcdebc', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59659) authorize /opt/stack/nova/nova/policy.py:203}} [ 759.716419] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-52006278-e00f-460f-9ecf-b26e6a52cc36 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 759.746555] env[59659]: WARNING nova.virt.vmwareapi.vmops [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 1195d592-faa2-43d7-af58-12b75abd5ed0 could not be found. [ 759.747197] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] Instance destroyed {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 759.747197] env[59659]: INFO nova.compute.manager [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] Took 0.05 seconds to destroy the instance on the hypervisor. [ 759.747320] env[59659]: DEBUG oslo.service.loopingcall [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59659) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 759.748156] env[59659]: DEBUG nova.network.neutron [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] Instance cache missing network info. 
{{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 759.750153] env[59659]: DEBUG nova.compute.manager [-] [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 759.750254] env[59659]: DEBUG nova.network.neutron [-] [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 759.800889] env[59659]: DEBUG nova.network.neutron [-] [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 759.809410] env[59659]: DEBUG nova.network.neutron [-] [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 759.822636] env[59659]: INFO nova.compute.manager [-] [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] Took 0.07 seconds to deallocate network for instance. [ 759.825173] env[59659]: DEBUG nova.compute.claims [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] Aborting claim: {{(pid=59659) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 759.825422] env[59659]: DEBUG oslo_concurrency.lockutils [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 759.825698] env[59659]: DEBUG oslo_concurrency.lockutils [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 760.054621] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-269baf0f-0d7f-415d-a3f6-b6c00ab8192e {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 760.064262] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ef50bfb-ed61-44c1-b4dc-1bca0fa9522a {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 760.100653] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c49b878f-3c1c-44c4-a35a-10fe2bfd545e {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 760.111763] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b322c593-84ca-40cb-9468-8bb4b975a5ca {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 760.127140] env[59659]: DEBUG nova.compute.provider_tree [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 
tempest-ServerActionsTestJSON-335386642-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 760.143897] env[59659]: DEBUG nova.scheduler.client.report [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 760.169403] env[59659]: DEBUG oslo_concurrency.lockutils [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.343s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 760.169403] env[59659]: ERROR nova.compute.manager [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 8ff4da7c-e6f1-47e2-8c4b-f5abfbc23ffa, please check neutron logs for more information. 
[ 760.169403] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] Traceback (most recent call last): [ 760.169403] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 760.169403] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] self.driver.spawn(context, instance, image_meta, [ 760.169403] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 760.169403] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] self._vmops.spawn(context, instance, image_meta, injected_files, [ 760.169403] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 760.169403] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] vm_ref = self.build_virtual_machine(instance, [ 760.169724] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 760.169724] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] vif_infos = vmwarevif.get_vif_info(self._session, [ 760.169724] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 760.169724] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] for vif in network_info: [ 760.169724] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 760.169724] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] return self._sync_wrapper(fn, *args, **kwargs) [ 760.169724] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 760.169724] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] self.wait() [ 760.169724] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 760.169724] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] self[:] = self._gt.wait() [ 760.169724] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 760.169724] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] return self._exit_event.wait() [ 760.169724] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 760.169724] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] result = hub.switch() [ 760.170169] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 760.170169] env[59659]: ERROR 
nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] return self.greenlet.switch() [ 760.170169] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 760.170169] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] result = function(*args, **kwargs) [ 760.170169] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 760.170169] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] return func(*args, **kwargs) [ 760.170169] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 760.170169] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] raise e [ 760.170169] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 760.170169] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] nwinfo = self.network_api.allocate_for_instance( [ 760.170169] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 760.170169] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] created_port_ids = self._update_ports_for_instance( [ 760.170169] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 760.170613] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] with excutils.save_and_reraise_exception(): [ 760.170613] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 760.170613] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] self.force_reraise() [ 760.170613] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 760.170613] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] raise self.value [ 760.170613] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 760.170613] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] updated_port = self._update_port( [ 760.170613] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 760.170613] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] _ensure_no_port_binding_failure(port) [ 760.170613] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 760.170613] env[59659]: ERROR 
nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] raise exception.PortBindingFailed(port_id=port['id']) [ 760.170613] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] nova.exception.PortBindingFailed: Binding failed for port 8ff4da7c-e6f1-47e2-8c4b-f5abfbc23ffa, please check neutron logs for more information. [ 760.170613] env[59659]: ERROR nova.compute.manager [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] [ 760.170954] env[59659]: DEBUG nova.compute.utils [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] Binding failed for port 8ff4da7c-e6f1-47e2-8c4b-f5abfbc23ffa, please check neutron logs for more information. {{(pid=59659) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 760.174987] env[59659]: DEBUG nova.compute.manager [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] Build of instance 1195d592-faa2-43d7-af58-12b75abd5ed0 was re-scheduled: Binding failed for port 8ff4da7c-e6f1-47e2-8c4b-f5abfbc23ffa, please check neutron logs for more information. {{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 760.174987] env[59659]: DEBUG nova.compute.manager [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] Unplugging VIFs for instance {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 760.174987] env[59659]: DEBUG oslo_concurrency.lockutils [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] Acquiring lock "refresh_cache-1195d592-faa2-43d7-af58-12b75abd5ed0" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 760.174987] env[59659]: DEBUG oslo_concurrency.lockutils [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] Acquired lock "refresh_cache-1195d592-faa2-43d7-af58-12b75abd5ed0" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 760.175414] env[59659]: DEBUG nova.network.neutron [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 760.211987] env[59659]: DEBUG nova.network.neutron [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] Instance cache missing network info. 
{{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 760.287460] env[59659]: DEBUG nova.network.neutron [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 760.295172] env[59659]: DEBUG oslo_concurrency.lockutils [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] Acquiring lock "2d63a2a4-b912-487e-aa10-9e68d877baab" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 760.295172] env[59659]: DEBUG oslo_concurrency.lockutils [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] Lock "2d63a2a4-b912-487e-aa10-9e68d877baab" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 760.301515] env[59659]: DEBUG oslo_concurrency.lockutils [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Releasing lock "refresh_cache-abec9f87-4cde-4b5e-ad2a-fa682842ac7a" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 760.301906] env[59659]: DEBUG nova.compute.manager [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] Start destroying the instance on the hypervisor. {{(pid=59659) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 760.302125] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] Destroying instance {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 760.303145] env[59659]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-f691fdc0-4f39-48ab-80c3-be7bed14e488 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 760.313934] env[59659]: DEBUG nova.compute.manager [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] Starting instance... 
{{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 760.319918] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fbaabfd9-e0db-4ff9-915a-04376d3d3ba3 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 760.349618] env[59659]: WARNING nova.virt.vmwareapi.vmops [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance abec9f87-4cde-4b5e-ad2a-fa682842ac7a could not be found. [ 760.350150] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] Instance destroyed {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 760.350150] env[59659]: INFO nova.compute.manager [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] Took 0.05 seconds to destroy the instance on the hypervisor. [ 760.350297] env[59659]: DEBUG oslo.service.loopingcall [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59659) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 760.350659] env[59659]: DEBUG nova.compute.manager [-] [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 760.350659] env[59659]: DEBUG nova.network.neutron [-] [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 760.385796] env[59659]: DEBUG oslo_concurrency.lockutils [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 760.386355] env[59659]: DEBUG oslo_concurrency.lockutils [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 760.387707] env[59659]: INFO nova.compute.claims [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 760.589845] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-49b3cf5c-c54f-442a-8fc2-32487e67b7de {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 760.598952] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4fb0ef17-ed46-4390-9908-eba1bf09f1aa {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 760.630980] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a5c1f17-d775-4869-ac2a-eed82479efc8 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 760.639675] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-76f06503-bc4d-4cac-af50-2274d8c4349d {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 760.656258] env[59659]: DEBUG nova.compute.provider_tree [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 760.660111] env[59659]: DEBUG nova.network.neutron [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 760.660111] env[59659]: DEBUG nova.network.neutron [-] [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 760.669223] env[59659]: DEBUG nova.scheduler.client.report [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 760.673779] env[59659]: DEBUG oslo_concurrency.lockutils [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] Releasing lock "refresh_cache-1195d592-faa2-43d7-af58-12b75abd5ed0" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 760.673779] env[59659]: DEBUG nova.compute.manager [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 760.673779] env[59659]: DEBUG nova.compute.manager [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 760.673779] env[59659]: DEBUG nova.network.neutron [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 760.679813] env[59659]: DEBUG nova.network.neutron [-] [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 760.692370] env[59659]: DEBUG oslo_concurrency.lockutils [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.306s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 760.692887] env[59659]: DEBUG nova.compute.manager [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] Start building networks asynchronously for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 760.696922] env[59659]: INFO nova.compute.manager [-] [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] Took 0.35 seconds to deallocate network for instance. 
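Editor's note: the repeated PortBindingFailed tracebacks above all funnel through the same check: after Neutron returns the updated port, Nova inspects the binding result and raises if the VIF type reports a failed binding, which is what kicks off the terminate / deallocate-network / abort-claim sequence recorded here. The snippet below is a simplified, self-contained sketch of that check, not Nova's actual code; the exception class and the port dict are stand-ins shaped like what the log shows.

    # Simplified stand-in for nova.exception.PortBindingFailed; the message
    # mirrors the one seen in the traceback above.
    class PortBindingFailed(Exception):
        def __init__(self, port_id):
            super().__init__(
                "Binding failed for port %s, please check neutron logs "
                "for more information." % port_id)

    def ensure_no_port_binding_failure(port):
        # Neutron reports a failed binding through the port's
        # binding:vif_type attribute; 'binding_failed' means no mechanism
        # driver could bind the port on this host.
        if port.get('binding:vif_type') == 'binding_failed':
            raise PortBindingFailed(port_id=port['id'])

    # Hypothetical port dict shaped like a Neutron API response.
    port = {'id': 'f3f7985f-2ffb-489b-bf50-5a30759b413b',
            'binding:vif_type': 'binding_failed'}
    try:
        ensure_no_port_binding_failure(port)
    except PortBindingFailed as exc:
        print(exc)

When the exception propagates out of _allocate_network_async, the build is re-scheduled and the claim is aborted, which is exactly the sequence the surrounding records trace.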
[ 760.699169] env[59659]: DEBUG nova.compute.claims [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] Aborting claim: {{(pid=59659) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 760.699404] env[59659]: DEBUG oslo_concurrency.lockutils [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 760.699539] env[59659]: DEBUG oslo_concurrency.lockutils [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 760.754307] env[59659]: DEBUG nova.compute.utils [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] Using /dev/sd instead of None {{(pid=59659) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 760.760163] env[59659]: DEBUG nova.compute.manager [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] Allocating IP information in the background. {{(pid=59659) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 760.760163] env[59659]: DEBUG nova.network.neutron [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] allocate_for_instance() {{(pid=59659) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 760.767730] env[59659]: DEBUG nova.compute.manager [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] Start building block device mappings for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 760.823518] env[59659]: DEBUG nova.network.neutron [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] Instance cache missing network info. 
{{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 760.838471] env[59659]: DEBUG nova.network.neutron [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 760.855362] env[59659]: DEBUG nova.policy [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '92c8ce085a514809aaf0fabb9982ccd2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f8f74d0c25be4992b4701c8838638ab3', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59659) authorize /opt/stack/nova/nova/policy.py:203}} [ 760.865838] env[59659]: DEBUG nova.compute.manager [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] Start spawning the instance on the hypervisor. {{(pid=59659) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 760.876431] env[59659]: INFO nova.compute.manager [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] [instance: 1195d592-faa2-43d7-af58-12b75abd5ed0] Took 0.20 seconds to deallocate network for instance. 
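Editor's note: the "Acquiring lock ... / acquired ... waited / released ... held" DEBUG lines that bracket the claim handling above are emitted by oslo.concurrency's lockutils instrumentation (the inner wrapper referenced at lockutils.py:404/409/423). A minimal sketch of that pattern follows; the lock name matches the log, but the decorated function and its body are illustrative only, not the ResourceTracker implementation.

    from oslo_concurrency import lockutils

    COMPUTE_RESOURCE_SEMAPHORE = 'compute_resources'  # lock name from the log

    @lockutils.synchronized(COMPUTE_RESOURCE_SEMAPHORE)
    def abort_instance_claim(instance_uuid):
        # While this body runs, any other caller synchronized on the same
        # name blocks; lockutils logs how long each caller waited for the
        # lock and how long it was held, which is what the timings above
        # (e.g. "waited 0.000s", "held 0.306s") record.
        print('releasing resources claimed by %s' % instance_uuid)

    abort_instance_claim('abec9f87-4cde-4b5e-ad2a-fa682842ac7a')

Instance claims and their aborts are serialized on this one lock, so the held/waited durations in the log are a direct measure of contention on the resource tracker.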
[ 760.893027] env[59659]: DEBUG nova.virt.hardware [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-30T19:41:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-30T19:41:37Z,direct_url=,disk_format='vmdk',id=0fa786c9-f55e-46dc-b725-aa456ca9ff53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='3f4ccdf020b5413c9a4233eccb6b55a8',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-30T19:41:37Z,virtual_size=,visibility=), allow threads: False {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 760.893027] env[59659]: DEBUG nova.virt.hardware [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] Flavor limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 760.893027] env[59659]: DEBUG nova.virt.hardware [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] Image limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 760.893401] env[59659]: DEBUG nova.virt.hardware [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] Flavor pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 760.893401] env[59659]: DEBUG nova.virt.hardware [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] Image pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 760.893401] env[59659]: DEBUG nova.virt.hardware [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 760.893401] env[59659]: DEBUG nova.virt.hardware [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 760.893521] env[59659]: DEBUG nova.virt.hardware [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 760.893711] env[59659]: DEBUG nova.virt.hardware [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 
tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] Got 1 possible topologies {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 760.893879] env[59659]: DEBUG nova.virt.hardware [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 760.894054] env[59659]: DEBUG nova.virt.hardware [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 760.895229] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f68e86ef-5bb6-473d-80a7-ab2dbb304f85 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 760.899229] env[59659]: DEBUG nova.network.neutron [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] [instance: 0ed4be35-b845-48ca-b892-657d96c12728] Successfully created port: 560b15a6-3e21-4068-94d0-df0d6e201268 {{(pid=59659) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 760.905022] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-25114074-1054-4ac5-af02-4a1bf59d6fb6 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 760.957734] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c6e17a8-4cae-4587-8e0f-faf8b6946b1a {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 760.978757] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba4436b5-1b27-4b4c-913c-341ba7a91958 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 761.021804] env[59659]: INFO nova.scheduler.client.report [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] Deleted allocations for instance 1195d592-faa2-43d7-af58-12b75abd5ed0 [ 761.031023] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4a4231c9-dc79-4a2c-8ed0-effbda1a16c3 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 761.043659] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6fe8bab4-888e-4f20-98a6-04a0d0bc517a {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 761.053697] env[59659]: DEBUG oslo_concurrency.lockutils [None req-8908f05b-872f-4666-993f-86042f4b27f5 tempest-ServerActionsTestJSON-335386642 tempest-ServerActionsTestJSON-335386642-project-member] Lock "1195d592-faa2-43d7-af58-12b75abd5ed0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 
6.409s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 761.063573] env[59659]: DEBUG nova.compute.provider_tree [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 761.079298] env[59659]: DEBUG nova.scheduler.client.report [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 761.104207] env[59659]: DEBUG oslo_concurrency.lockutils [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.404s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 761.104899] env[59659]: ERROR nova.compute.manager [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port f3f7985f-2ffb-489b-bf50-5a30759b413b, please check neutron logs for more information. 
[ 761.104899] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] Traceback (most recent call last): [ 761.104899] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 761.104899] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] self.driver.spawn(context, instance, image_meta, [ 761.104899] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 761.104899] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 761.104899] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 761.104899] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] vm_ref = self.build_virtual_machine(instance, [ 761.104899] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 761.104899] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] vif_infos = vmwarevif.get_vif_info(self._session, [ 761.104899] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 761.105260] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] for vif in network_info: [ 761.105260] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 761.105260] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] return self._sync_wrapper(fn, *args, **kwargs) [ 761.105260] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 761.105260] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] self.wait() [ 761.105260] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 761.105260] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] self[:] = self._gt.wait() [ 761.105260] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 761.105260] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] return self._exit_event.wait() [ 761.105260] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 761.105260] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] result = hub.switch() [ 761.105260] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 761.105260] env[59659]: ERROR 
nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] return self.greenlet.switch() [ 761.105260] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 761.105606] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] result = function(*args, **kwargs) [ 761.105606] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 761.105606] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] return func(*args, **kwargs) [ 761.105606] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 761.105606] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] raise e [ 761.105606] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 761.105606] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] nwinfo = self.network_api.allocate_for_instance( [ 761.105606] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 761.105606] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] created_port_ids = self._update_ports_for_instance( [ 761.105606] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 761.105606] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] with excutils.save_and_reraise_exception(): [ 761.105606] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 761.105606] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] self.force_reraise() [ 761.105984] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 761.105984] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] raise self.value [ 761.105984] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 761.105984] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] updated_port = self._update_port( [ 761.105984] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 761.105984] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] _ensure_no_port_binding_failure(port) [ 761.105984] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 761.105984] env[59659]: ERROR 
nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] raise exception.PortBindingFailed(port_id=port['id']) [ 761.105984] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] nova.exception.PortBindingFailed: Binding failed for port f3f7985f-2ffb-489b-bf50-5a30759b413b, please check neutron logs for more information. [ 761.105984] env[59659]: ERROR nova.compute.manager [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] [ 761.105984] env[59659]: DEBUG nova.compute.utils [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] Binding failed for port f3f7985f-2ffb-489b-bf50-5a30759b413b, please check neutron logs for more information. {{(pid=59659) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 761.108025] env[59659]: WARNING oslo_vmware.rw_handles [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 761.108025] env[59659]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 761.108025] env[59659]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 761.108025] env[59659]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 761.108025] env[59659]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 761.108025] env[59659]: ERROR oslo_vmware.rw_handles response.begin() [ 761.108025] env[59659]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 761.108025] env[59659]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 761.108025] env[59659]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 761.108025] env[59659]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 761.108025] env[59659]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 761.108025] env[59659]: ERROR oslo_vmware.rw_handles [ 761.108568] env[59659]: DEBUG nova.virt.vmwareapi.images [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] Downloaded image file data 0fa786c9-f55e-46dc-b725-aa456ca9ff53 to vmware_temp/502a1631-b8ed-4837-9429-7e90b81eeee6/0fa786c9-f55e-46dc-b725-aa456ca9ff53/tmp-sparse.vmdk on the data store datastore2 {{(pid=59659) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 761.112131] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] Caching image {{(pid=59659) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 761.112414] env[59659]: DEBUG nova.virt.vmwareapi.vm_util [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Copying Virtual Disk [datastore2] 
vmware_temp/502a1631-b8ed-4837-9429-7e90b81eeee6/0fa786c9-f55e-46dc-b725-aa456ca9ff53/tmp-sparse.vmdk to [datastore2] vmware_temp/502a1631-b8ed-4837-9429-7e90b81eeee6/0fa786c9-f55e-46dc-b725-aa456ca9ff53/0fa786c9-f55e-46dc-b725-aa456ca9ff53.vmdk {{(pid=59659) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 761.112969] env[59659]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-96b4e5ef-c00a-45bf-bb4f-4bfa062c28bb {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 761.116060] env[59659]: DEBUG nova.compute.manager [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] Build of instance abec9f87-4cde-4b5e-ad2a-fa682842ac7a was re-scheduled: Binding failed for port f3f7985f-2ffb-489b-bf50-5a30759b413b, please check neutron logs for more information. {{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 761.116503] env[59659]: DEBUG nova.compute.manager [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] Unplugging VIFs for instance {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 761.116722] env[59659]: DEBUG oslo_concurrency.lockutils [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Acquiring lock "refresh_cache-abec9f87-4cde-4b5e-ad2a-fa682842ac7a" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 761.116867] env[59659]: DEBUG oslo_concurrency.lockutils [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Acquired lock "refresh_cache-abec9f87-4cde-4b5e-ad2a-fa682842ac7a" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 761.117036] env[59659]: DEBUG nova.network.neutron [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 761.125435] env[59659]: DEBUG oslo_vmware.api [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Waiting for the task: (returnval){ [ 761.125435] env[59659]: value = "task-1384553" [ 761.125435] env[59659]: _type = "Task" [ 761.125435] env[59659]: } to complete. {{(pid=59659) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 761.141544] env[59659]: DEBUG oslo_vmware.api [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Task: {'id': task-1384553, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=59659) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 761.202656] env[59659]: DEBUG nova.network.neutron [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] [instance: 938a2016-8eaa-446a-b69c-3af59448d944] Successfully created port: 919aae49-1ef4-4d7f-a76b-82c6e8107512 {{(pid=59659) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 761.230019] env[59659]: DEBUG nova.network.neutron [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 761.518273] env[59659]: DEBUG nova.network.neutron [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] Successfully created port: 7f2b0be1-4d42-4abc-a751-d4b312b5c4a5 {{(pid=59659) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 761.648165] env[59659]: DEBUG oslo_vmware.exceptions [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Fault InvalidArgument not matched. {{(pid=59659) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 761.648629] env[59659]: DEBUG oslo_concurrency.lockutils [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Releasing lock "[datastore2] devstack-image-cache_base/0fa786c9-f55e-46dc-b725-aa456ca9ff53/0fa786c9-f55e-46dc-b725-aa456ca9ff53.vmdk" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 761.652165] env[59659]: ERROR nova.compute.manager [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 761.652165] env[59659]: Faults: ['InvalidArgument'] [ 761.652165] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] Traceback (most recent call last): [ 761.652165] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 761.652165] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] yield resources [ 761.652165] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 761.652165] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] self.driver.spawn(context, instance, image_meta, [ 761.652165] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 761.652165] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 
761.652165] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 761.652165] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] self._fetch_image_if_missing(context, vi) [ 761.652165] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 761.652582] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] image_cache(vi, tmp_image_ds_loc) [ 761.652582] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 761.652582] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] vm_util.copy_virtual_disk( [ 761.652582] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 761.652582] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] session._wait_for_task(vmdk_copy_task) [ 761.652582] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 761.652582] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] return self.wait_for_task(task_ref) [ 761.652582] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 761.652582] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] return evt.wait() [ 761.652582] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 761.652582] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] result = hub.switch() [ 761.652582] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 761.652582] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] return self.greenlet.switch() [ 761.652942] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 761.652942] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] self.f(*self.args, **self.kw) [ 761.652942] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 761.652942] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] raise exceptions.translate_fault(task_info.error) [ 761.652942] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 761.652942] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] Faults: 
['InvalidArgument'] [ 761.652942] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] [ 761.652942] env[59659]: INFO nova.compute.manager [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] Terminating instance [ 761.652942] env[59659]: DEBUG oslo_concurrency.lockutils [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Acquiring lock "refresh_cache-ea968312-62ea-4f55-87e9-f91823fc14c2" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 761.653264] env[59659]: DEBUG oslo_concurrency.lockutils [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Acquired lock "refresh_cache-ea968312-62ea-4f55-87e9-f91823fc14c2" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 761.653264] env[59659]: DEBUG nova.network.neutron [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 761.702484] env[59659]: DEBUG nova.network.neutron [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 761.792622] env[59659]: DEBUG nova.network.neutron [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 761.812532] env[59659]: DEBUG oslo_concurrency.lockutils [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Releasing lock "refresh_cache-ea968312-62ea-4f55-87e9-f91823fc14c2" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 761.812937] env[59659]: DEBUG nova.compute.manager [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] Start destroying the instance on the hypervisor. 
{{(pid=59659) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 761.813147] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] Destroying instance {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 761.814247] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f20572d-1464-48ed-9fe0-95cd5bbc7352 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 761.825752] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] Unregistering the VM {{(pid=59659) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 761.826091] env[59659]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-0179e595-eb58-4579-b9c1-1d3db4f6acbb {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 761.860212] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] Unregistered the VM {{(pid=59659) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 761.860425] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] Deleting contents of the VM from datastore datastore2 {{(pid=59659) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 761.860593] env[59659]: DEBUG nova.virt.vmwareapi.ds_util [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Deleting the datastore file [datastore2] ea968312-62ea-4f55-87e9-f91823fc14c2 {{(pid=59659) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 761.860853] env[59659]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-a993e654-dd72-43e8-a028-b4ecffab1989 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 761.873832] env[59659]: DEBUG oslo_vmware.api [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Waiting for the task: (returnval){ [ 761.873832] env[59659]: value = "task-1384555" [ 761.873832] env[59659]: _type = "Task" [ 761.873832] env[59659]: } to complete. {{(pid=59659) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 761.886875] env[59659]: DEBUG oslo_vmware.api [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Task: {'id': task-1384555, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59659) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 762.170177] env[59659]: DEBUG nova.network.neutron [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 762.179591] env[59659]: DEBUG oslo_concurrency.lockutils [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Releasing lock "refresh_cache-abec9f87-4cde-4b5e-ad2a-fa682842ac7a" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 762.179880] env[59659]: DEBUG nova.compute.manager [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 762.180080] env[59659]: DEBUG nova.compute.manager [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 762.180242] env[59659]: DEBUG nova.network.neutron [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 762.271568] env[59659]: DEBUG nova.network.neutron [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 762.282386] env[59659]: DEBUG nova.network.neutron [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 762.286905] env[59659]: DEBUG nova.network.neutron [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] Successfully created port: 35682cb9-e9d7-4838-847e-d5bcc61f6753 {{(pid=59659) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 762.296160] env[59659]: INFO nova.compute.manager [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] Took 0.12 seconds to deallocate network for instance. 
[ 762.389354] env[59659]: DEBUG oslo_vmware.api [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Task: {'id': task-1384555, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.042441} completed successfully. {{(pid=59659) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 762.389570] env[59659]: DEBUG nova.virt.vmwareapi.ds_util [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Deleted the datastore file {{(pid=59659) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 762.390078] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] Deleted contents of the VM from datastore datastore2 {{(pid=59659) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 762.390204] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] Instance destroyed {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 762.390331] env[59659]: INFO nova.compute.manager [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] Took 0.58 seconds to destroy the instance on the hypervisor. [ 762.390566] env[59659]: DEBUG oslo.service.loopingcall [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59659) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 762.390773] env[59659]: DEBUG nova.compute.manager [-] [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] Skipping network deallocation for instance since networking was not requested. 
{{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 762.396785] env[59659]: DEBUG nova.compute.claims [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] Aborting claim: {{(pid=59659) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 762.396785] env[59659]: DEBUG oslo_concurrency.lockutils [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 762.397126] env[59659]: DEBUG oslo_concurrency.lockutils [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 762.405126] env[59659]: INFO nova.scheduler.client.report [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Deleted allocations for instance abec9f87-4cde-4b5e-ad2a-fa682842ac7a [ 762.438989] env[59659]: DEBUG oslo_concurrency.lockutils [None req-77dcc848-a5f0-4c0e-92f4-b129eec2903d tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Lock "abec9f87-4cde-4b5e-ad2a-fa682842ac7a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 15.742s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 762.439311] env[59659]: DEBUG oslo_concurrency.lockutils [None req-5efc2d9d-ed13-4a66-91cf-0c348fc8d21a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Lock "abec9f87-4cde-4b5e-ad2a-fa682842ac7a" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 12.097s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 762.439576] env[59659]: DEBUG oslo_concurrency.lockutils [None req-5efc2d9d-ed13-4a66-91cf-0c348fc8d21a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Acquiring lock "abec9f87-4cde-4b5e-ad2a-fa682842ac7a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 762.439818] env[59659]: DEBUG oslo_concurrency.lockutils [None req-5efc2d9d-ed13-4a66-91cf-0c348fc8d21a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Lock "abec9f87-4cde-4b5e-ad2a-fa682842ac7a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 762.440067] env[59659]: DEBUG oslo_concurrency.lockutils [None req-5efc2d9d-ed13-4a66-91cf-0c348fc8d21a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Lock 
"abec9f87-4cde-4b5e-ad2a-fa682842ac7a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 762.448358] env[59659]: INFO nova.compute.manager [None req-5efc2d9d-ed13-4a66-91cf-0c348fc8d21a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] Terminating instance [ 762.450879] env[59659]: DEBUG oslo_concurrency.lockutils [None req-5efc2d9d-ed13-4a66-91cf-0c348fc8d21a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Acquiring lock "refresh_cache-abec9f87-4cde-4b5e-ad2a-fa682842ac7a" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 762.451507] env[59659]: DEBUG oslo_concurrency.lockutils [None req-5efc2d9d-ed13-4a66-91cf-0c348fc8d21a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Acquired lock "refresh_cache-abec9f87-4cde-4b5e-ad2a-fa682842ac7a" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 762.451744] env[59659]: DEBUG nova.network.neutron [None req-5efc2d9d-ed13-4a66-91cf-0c348fc8d21a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 762.555469] env[59659]: DEBUG nova.network.neutron [None req-5efc2d9d-ed13-4a66-91cf-0c348fc8d21a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] Instance cache missing network info. 
{{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 762.579961] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb44d6dc-a2e4-4e4a-82a6-c12bdf7fc407 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 762.588832] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a37c4476-3a0f-41da-bb60-4a6b2da18034 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 762.623242] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d02379f6-b8dd-4f5e-aaac-c0c5e558fd72 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 762.633510] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-64a33106-6d47-41fa-91e9-8bb3ef804dde {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 762.649468] env[59659]: DEBUG nova.compute.provider_tree [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 762.659125] env[59659]: DEBUG nova.scheduler.client.report [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 762.683312] env[59659]: DEBUG oslo_concurrency.lockutils [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.285s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 762.683312] env[59659]: ERROR nova.compute.manager [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 762.683312] env[59659]: Faults: ['InvalidArgument'] [ 762.683312] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] Traceback (most recent call last): [ 762.683312] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 762.683312] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] self.driver.spawn(context, instance, 
image_meta, [ 762.683312] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 762.683312] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 762.683312] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 762.683312] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] self._fetch_image_if_missing(context, vi) [ 762.683696] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 762.683696] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] image_cache(vi, tmp_image_ds_loc) [ 762.683696] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 762.683696] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] vm_util.copy_virtual_disk( [ 762.683696] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 762.683696] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] session._wait_for_task(vmdk_copy_task) [ 762.683696] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 762.683696] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] return self.wait_for_task(task_ref) [ 762.683696] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 762.683696] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] return evt.wait() [ 762.683696] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 762.683696] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] result = hub.switch() [ 762.683696] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 762.684045] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] return self.greenlet.switch() [ 762.684045] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 762.684045] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] self.f(*self.args, **self.kw) [ 762.684045] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 762.684045] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] raise 
exceptions.translate_fault(task_info.error) [ 762.684045] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 762.684045] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] Faults: ['InvalidArgument'] [ 762.684045] env[59659]: ERROR nova.compute.manager [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] [ 762.684045] env[59659]: DEBUG nova.compute.utils [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] VimFaultException {{(pid=59659) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 762.685123] env[59659]: DEBUG nova.compute.manager [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] Build of instance ea968312-62ea-4f55-87e9-f91823fc14c2 was re-scheduled: A specified parameter was not correct: fileType [ 762.685123] env[59659]: Faults: ['InvalidArgument'] {{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 762.686253] env[59659]: DEBUG nova.compute.manager [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] Unplugging VIFs for instance {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 762.686253] env[59659]: DEBUG oslo_concurrency.lockutils [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Acquiring lock "refresh_cache-ea968312-62ea-4f55-87e9-f91823fc14c2" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 762.686253] env[59659]: DEBUG oslo_concurrency.lockutils [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Acquired lock "refresh_cache-ea968312-62ea-4f55-87e9-f91823fc14c2" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 762.686396] env[59659]: DEBUG nova.network.neutron [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 762.725563] env[59659]: DEBUG nova.network.neutron [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] Instance cache missing network info. 
{{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 762.839017] env[59659]: DEBUG nova.network.neutron [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 762.848895] env[59659]: DEBUG oslo_concurrency.lockutils [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Releasing lock "refresh_cache-ea968312-62ea-4f55-87e9-f91823fc14c2" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 762.849146] env[59659]: DEBUG nova.compute.manager [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 762.849319] env[59659]: DEBUG nova.compute.manager [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] [instance: ea968312-62ea-4f55-87e9-f91823fc14c2] Skipping network deallocation for instance since networking was not requested. {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 762.954904] env[59659]: INFO nova.scheduler.client.report [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Deleted allocations for instance ea968312-62ea-4f55-87e9-f91823fc14c2 [ 762.970461] env[59659]: DEBUG oslo_concurrency.lockutils [None req-20524171-0491-451c-acd6-bf4548e093fb tempest-ServerShowV257Test-1565272682 tempest-ServerShowV257Test-1565272682-project-member] Lock "ea968312-62ea-4f55-87e9-f91823fc14c2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 52.473s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 763.105498] env[59659]: DEBUG oslo_concurrency.lockutils [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] Acquiring lock "f31b7ca3-60d1-4206-ac49-f85ec6194f85" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 763.105862] env[59659]: DEBUG oslo_concurrency.lockutils [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] Lock "f31b7ca3-60d1-4206-ac49-f85ec6194f85" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 763.119055] env[59659]: DEBUG nova.compute.manager [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] 
Starting instance... {{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 763.172349] env[59659]: DEBUG oslo_concurrency.lockutils [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 763.172615] env[59659]: DEBUG oslo_concurrency.lockutils [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 763.174167] env[59659]: INFO nova.compute.claims [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 763.312756] env[59659]: ERROR nova.compute.manager [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 7f2b0be1-4d42-4abc-a751-d4b312b5c4a5, please check neutron logs for more information. [ 763.312756] env[59659]: ERROR nova.compute.manager Traceback (most recent call last): [ 763.312756] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 763.312756] env[59659]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 763.312756] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 763.312756] env[59659]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 763.312756] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 763.312756] env[59659]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 763.312756] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 763.312756] env[59659]: ERROR nova.compute.manager self.force_reraise() [ 763.312756] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 763.312756] env[59659]: ERROR nova.compute.manager raise self.value [ 763.312756] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 763.312756] env[59659]: ERROR nova.compute.manager updated_port = self._update_port( [ 763.312756] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 763.312756] env[59659]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 763.313231] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 763.313231] env[59659]: ERROR 
nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 763.313231] env[59659]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 7f2b0be1-4d42-4abc-a751-d4b312b5c4a5, please check neutron logs for more information. [ 763.313231] env[59659]: ERROR nova.compute.manager [ 763.313231] env[59659]: Traceback (most recent call last): [ 763.313231] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 763.313231] env[59659]: listener.cb(fileno) [ 763.313231] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 763.313231] env[59659]: result = function(*args, **kwargs) [ 763.313231] env[59659]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 763.313231] env[59659]: return func(*args, **kwargs) [ 763.313231] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 763.313231] env[59659]: raise e [ 763.313231] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 763.313231] env[59659]: nwinfo = self.network_api.allocate_for_instance( [ 763.313231] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 763.313231] env[59659]: created_port_ids = self._update_ports_for_instance( [ 763.313231] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 763.313231] env[59659]: with excutils.save_and_reraise_exception(): [ 763.313231] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 763.313231] env[59659]: self.force_reraise() [ 763.313231] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 763.313231] env[59659]: raise self.value [ 763.313231] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 763.313231] env[59659]: updated_port = self._update_port( [ 763.313231] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 763.313231] env[59659]: _ensure_no_port_binding_failure(port) [ 763.313231] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 763.313231] env[59659]: raise exception.PortBindingFailed(port_id=port['id']) [ 763.313952] env[59659]: nova.exception.PortBindingFailed: Binding failed for port 7f2b0be1-4d42-4abc-a751-d4b312b5c4a5, please check neutron logs for more information. [ 763.313952] env[59659]: Removing descriptor: 12 [ 763.313952] env[59659]: ERROR nova.compute.manager [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 7f2b0be1-4d42-4abc-a751-d4b312b5c4a5, please check neutron logs for more information. 
[ 763.313952] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] Traceback (most recent call last): [ 763.313952] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 763.313952] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] yield resources [ 763.313952] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 763.313952] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] self.driver.spawn(context, instance, image_meta, [ 763.313952] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 763.313952] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] self._vmops.spawn(context, instance, image_meta, injected_files, [ 763.313952] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 763.313952] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] vm_ref = self.build_virtual_machine(instance, [ 763.314362] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 763.314362] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] vif_infos = vmwarevif.get_vif_info(self._session, [ 763.314362] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 763.314362] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] for vif in network_info: [ 763.314362] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 763.314362] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] return self._sync_wrapper(fn, *args, **kwargs) [ 763.314362] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 763.314362] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] self.wait() [ 763.314362] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 763.314362] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] self[:] = self._gt.wait() [ 763.314362] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 763.314362] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] return self._exit_event.wait() [ 763.314362] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 763.314362] env[59659]: ERROR nova.compute.manager 
[instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] result = hub.switch() [ 763.314753] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 763.314753] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] return self.greenlet.switch() [ 763.314753] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 763.314753] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] result = function(*args, **kwargs) [ 763.314753] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 763.314753] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] return func(*args, **kwargs) [ 763.314753] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 763.314753] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] raise e [ 763.314753] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 763.314753] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] nwinfo = self.network_api.allocate_for_instance( [ 763.314753] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 763.314753] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] created_port_ids = self._update_ports_for_instance( [ 763.314753] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 763.315185] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] with excutils.save_and_reraise_exception(): [ 763.315185] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 763.315185] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] self.force_reraise() [ 763.315185] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 763.315185] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] raise self.value [ 763.315185] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 763.315185] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] updated_port = self._update_port( [ 763.315185] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 763.315185] env[59659]: ERROR nova.compute.manager [instance: 
2d63a2a4-b912-487e-aa10-9e68d877baab] _ensure_no_port_binding_failure(port) [ 763.315185] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 763.315185] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] raise exception.PortBindingFailed(port_id=port['id']) [ 763.315185] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] nova.exception.PortBindingFailed: Binding failed for port 7f2b0be1-4d42-4abc-a751-d4b312b5c4a5, please check neutron logs for more information. [ 763.315185] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] [ 763.315590] env[59659]: INFO nova.compute.manager [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] Terminating instance [ 763.323472] env[59659]: DEBUG oslo_concurrency.lockutils [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] Acquiring lock "refresh_cache-2d63a2a4-b912-487e-aa10-9e68d877baab" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 763.323472] env[59659]: DEBUG oslo_concurrency.lockutils [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] Acquired lock "refresh_cache-2d63a2a4-b912-487e-aa10-9e68d877baab" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 763.323472] env[59659]: DEBUG nova.network.neutron [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 763.341178] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3e4bb34-2d8a-4e38-b0ab-7439881771d6 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 763.350361] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d20c87ea-a5bc-4345-a400-7581d8818d85 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 763.386835] env[59659]: DEBUG nova.network.neutron [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] Instance cache missing network info. 
{{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 763.389636] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a239a8a-457f-4a12-a57c-32b1514d58d2 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 763.396021] env[59659]: DEBUG nova.network.neutron [None req-5efc2d9d-ed13-4a66-91cf-0c348fc8d21a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 763.400137] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1098ffca-bf9d-4fe8-a37d-68b9ecee0e51 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 763.404980] env[59659]: DEBUG oslo_concurrency.lockutils [None req-5efc2d9d-ed13-4a66-91cf-0c348fc8d21a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Releasing lock "refresh_cache-abec9f87-4cde-4b5e-ad2a-fa682842ac7a" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 763.405366] env[59659]: DEBUG nova.compute.manager [None req-5efc2d9d-ed13-4a66-91cf-0c348fc8d21a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] Start destroying the instance on the hypervisor. {{(pid=59659) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 763.405585] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-5efc2d9d-ed13-4a66-91cf-0c348fc8d21a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] Destroying instance {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 763.406406] env[59659]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-3d51fa38-1261-4547-b99d-27c23c03e2b3 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 763.417995] env[59659]: DEBUG nova.compute.provider_tree [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 763.422918] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4c28110-5289-495d-ab83-2465df790fac {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 763.434303] env[59659]: DEBUG nova.scheduler.client.report [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': 
{'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 763.454890] env[59659]: WARNING nova.virt.vmwareapi.vmops [None req-5efc2d9d-ed13-4a66-91cf-0c348fc8d21a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance abec9f87-4cde-4b5e-ad2a-fa682842ac7a could not be found. [ 763.455112] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-5efc2d9d-ed13-4a66-91cf-0c348fc8d21a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] Instance destroyed {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 763.455295] env[59659]: INFO nova.compute.manager [None req-5efc2d9d-ed13-4a66-91cf-0c348fc8d21a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] Took 0.05 seconds to destroy the instance on the hypervisor. [ 763.455533] env[59659]: DEBUG oslo.service.loopingcall [None req-5efc2d9d-ed13-4a66-91cf-0c348fc8d21a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59659) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 763.456469] env[59659]: DEBUG nova.compute.manager [-] [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 763.456572] env[59659]: DEBUG nova.network.neutron [-] [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 763.458590] env[59659]: DEBUG oslo_concurrency.lockutils [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.286s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 763.458920] env[59659]: DEBUG nova.compute.manager [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] Start building networks asynchronously for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 763.493533] env[59659]: DEBUG nova.compute.utils [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] Using /dev/sd instead of None {{(pid=59659) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 763.499037] env[59659]: DEBUG nova.compute.manager [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] Allocating IP information in the background. 
{{(pid=59659) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 763.499037] env[59659]: DEBUG nova.network.neutron [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] allocate_for_instance() {{(pid=59659) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 763.505461] env[59659]: DEBUG nova.compute.manager [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] Start building block device mappings for instance. {{(pid=59659) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 763.532613] env[59659]: DEBUG nova.network.neutron [-] [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 763.538728] env[59659]: DEBUG nova.network.neutron [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 763.541194] env[59659]: INFO nova.virt.block_device [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] Booting with volume edd23c41-bc61-47c0-9e80-412513b26303 at /dev/sda [ 763.543380] env[59659]: DEBUG nova.network.neutron [-] [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 763.554357] env[59659]: DEBUG oslo_concurrency.lockutils [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] Releasing lock "refresh_cache-2d63a2a4-b912-487e-aa10-9e68d877baab" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 763.554800] env[59659]: DEBUG nova.compute.manager [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] Start destroying the instance on the hypervisor. {{(pid=59659) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 763.554962] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] Destroying instance {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 763.555697] env[59659]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-763e006a-0519-48f5-9a2c-f53f96ad90f3 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 763.558184] env[59659]: INFO nova.compute.manager [-] [instance: abec9f87-4cde-4b5e-ad2a-fa682842ac7a] Took 0.10 seconds to deallocate network for instance. 
[ 763.568161] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a6727166-86f1-4972-9e82-3b48173ac7c4 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 763.585417] env[59659]: DEBUG nova.policy [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cd62aa2126d64e81a565bc0d29c7f13d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd099026ef14c4a9ea51a62cb8dc15673', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59659) authorize /opt/stack/nova/nova/policy.py:203}} [ 763.588468] env[59659]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-9bcf1838-7af0-4fcc-8c2c-4f1e8775dd2a {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 763.599902] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab8f5fc4-9425-4175-893e-2d7265846c7e {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 763.620110] env[59659]: WARNING nova.virt.vmwareapi.vmops [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 2d63a2a4-b912-487e-aa10-9e68d877baab could not be found. [ 763.620343] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] Instance destroyed {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 763.620514] env[59659]: INFO nova.compute.manager [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] Took 0.07 seconds to destroy the instance on the hypervisor. [ 763.620755] env[59659]: DEBUG oslo.service.loopingcall [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=59659) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 763.624043] env[59659]: DEBUG nova.compute.manager [-] [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 763.624150] env[59659]: DEBUG nova.network.neutron [-] [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 763.638917] env[59659]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-9f0e8fea-83a3-46f2-a115-3c546a187644 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 763.649648] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c09af6c-d5eb-4b4e-add6-ebaaaaae7e72 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 763.663039] env[59659]: DEBUG nova.network.neutron [-] [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 763.671686] env[59659]: DEBUG nova.network.neutron [-] [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 763.685027] env[59659]: INFO nova.compute.manager [-] [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] Took 0.06 seconds to deallocate network for instance. [ 763.685481] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6eeb35fd-5e70-42fe-8029-cbb78ab05bc2 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 763.690995] env[59659]: DEBUG nova.compute.claims [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] Aborting claim: {{(pid=59659) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 763.691184] env[59659]: DEBUG oslo_concurrency.lockutils [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 763.691399] env[59659]: DEBUG oslo_concurrency.lockutils [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 763.700409] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4025e0e-8b37-4fbe-b2ad-cbc3b61f087a {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 763.716187] env[59659]: DEBUG nova.virt.block_device [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 
tempest-ServerActionsV293TestJSON-1756577926-project-member] [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] Updating existing volume attachment record: 494fa8ca-c212-4820-824a-b6c7e29bbe82 {{(pid=59659) _volume_attach /opt/stack/nova/nova/virt/block_device.py:631}} [ 763.730519] env[59659]: DEBUG oslo_concurrency.lockutils [None req-5efc2d9d-ed13-4a66-91cf-0c348fc8d21a tempest-DeleteServersTestJSON-1071653592 tempest-DeleteServersTestJSON-1071653592-project-member] Lock "abec9f87-4cde-4b5e-ad2a-fa682842ac7a" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.291s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 763.882575] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c5159806-c44d-4bb2-b012-a901e4bcc854 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 763.892650] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3deb14ad-8d53-4612-b0c2-f74fbc0d1913 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 763.932309] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1d69bd8-f651-4ddd-85d0-cf8fee097798 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 763.938114] env[59659]: DEBUG nova.compute.manager [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] Start spawning the instance on the hypervisor. 
{{(pid=59659) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 763.938541] env[59659]: DEBUG nova.virt.hardware [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-30T19:41:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=,container_format=,created_at=,direct_url=,disk_format=,id=,min_disk=0,min_ram=0,name=,owner=,properties=ImageMetaProps,protected=,size=1073741824,status='active',tags=,updated_at=,virtual_size=,visibility=), allow threads: False {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 763.938745] env[59659]: DEBUG nova.virt.hardware [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] Flavor limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 763.938888] env[59659]: DEBUG nova.virt.hardware [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] Image limits 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 763.939071] env[59659]: DEBUG nova.virt.hardware [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] Flavor pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 763.939208] env[59659]: DEBUG nova.virt.hardware [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] Image pref 0:0:0 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 763.939345] env[59659]: DEBUG nova.virt.hardware [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59659) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 763.939552] env[59659]: DEBUG nova.virt.hardware [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 763.939698] env[59659]: DEBUG nova.virt.hardware [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 763.939857] env[59659]: DEBUG nova.virt.hardware [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] 
Got 1 possible topologies {{(pid=59659) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 763.940020] env[59659]: DEBUG nova.virt.hardware [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 763.940190] env[59659]: DEBUG nova.virt.hardware [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59659) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 763.943828] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c47e40fb-2504-4c0a-bf47-eece782d5782 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 763.948529] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d97bbad-99ef-4210-9e17-55df257e7929 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 763.954095] env[59659]: DEBUG nova.network.neutron [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] Successfully created port: db49ee90-852b-4303-8959-d6a2ac7694bd {{(pid=59659) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 763.970153] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1c5038f-a3fb-4c44-b6cd-bd55cc718df1 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 763.976044] env[59659]: DEBUG nova.compute.provider_tree [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 763.989807] env[59659]: DEBUG nova.scheduler.client.report [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 764.013408] env[59659]: DEBUG oslo_concurrency.lockutils [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.318s {{(pid=59659) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 764.013408] env[59659]: ERROR nova.compute.manager [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 7f2b0be1-4d42-4abc-a751-d4b312b5c4a5, please check neutron logs for more information. [ 764.013408] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] Traceback (most recent call last): [ 764.013408] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 764.013408] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] self.driver.spawn(context, instance, image_meta, [ 764.013408] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 764.013408] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] self._vmops.spawn(context, instance, image_meta, injected_files, [ 764.013408] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 764.013408] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] vm_ref = self.build_virtual_machine(instance, [ 764.013835] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 764.013835] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] vif_infos = vmwarevif.get_vif_info(self._session, [ 764.013835] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 764.013835] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] for vif in network_info: [ 764.013835] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 764.013835] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] return self._sync_wrapper(fn, *args, **kwargs) [ 764.013835] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 764.013835] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] self.wait() [ 764.013835] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 764.013835] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] self[:] = self._gt.wait() [ 764.013835] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 764.013835] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] return self._exit_event.wait() [ 764.013835] env[59659]: ERROR 
nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 764.013835] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] result = hub.switch() [ 764.014220] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 764.014220] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] return self.greenlet.switch() [ 764.014220] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 764.014220] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] result = function(*args, **kwargs) [ 764.014220] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 764.014220] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] return func(*args, **kwargs) [ 764.014220] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 764.014220] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] raise e [ 764.014220] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 764.014220] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] nwinfo = self.network_api.allocate_for_instance( [ 764.014220] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 764.014220] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] created_port_ids = self._update_ports_for_instance( [ 764.014220] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 764.014557] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] with excutils.save_and_reraise_exception(): [ 764.014557] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 764.014557] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] self.force_reraise() [ 764.014557] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 764.014557] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] raise self.value [ 764.014557] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 764.014557] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] updated_port = self._update_port( [ 764.014557] env[59659]: ERROR 
nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 764.014557] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] _ensure_no_port_binding_failure(port) [ 764.014557] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 764.014557] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] raise exception.PortBindingFailed(port_id=port['id']) [ 764.014557] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] nova.exception.PortBindingFailed: Binding failed for port 7f2b0be1-4d42-4abc-a751-d4b312b5c4a5, please check neutron logs for more information. [ 764.014557] env[59659]: ERROR nova.compute.manager [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] [ 764.014922] env[59659]: DEBUG nova.compute.utils [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] Binding failed for port 7f2b0be1-4d42-4abc-a751-d4b312b5c4a5, please check neutron logs for more information. {{(pid=59659) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 764.014922] env[59659]: DEBUG nova.compute.manager [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] Build of instance 2d63a2a4-b912-487e-aa10-9e68d877baab was re-scheduled: Binding failed for port 7f2b0be1-4d42-4abc-a751-d4b312b5c4a5, please check neutron logs for more information. 
{{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 764.014922] env[59659]: DEBUG nova.compute.manager [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] Unplugging VIFs for instance {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 764.014922] env[59659]: DEBUG oslo_concurrency.lockutils [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] Acquiring lock "refresh_cache-2d63a2a4-b912-487e-aa10-9e68d877baab" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 764.015087] env[59659]: DEBUG oslo_concurrency.lockutils [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] Acquired lock "refresh_cache-2d63a2a4-b912-487e-aa10-9e68d877baab" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 764.015087] env[59659]: DEBUG nova.network.neutron [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 764.055637] env[59659]: DEBUG nova.network.neutron [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 764.213986] env[59659]: DEBUG nova.network.neutron [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 764.223552] env[59659]: DEBUG oslo_concurrency.lockutils [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] Releasing lock "refresh_cache-2d63a2a4-b912-487e-aa10-9e68d877baab" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 764.223772] env[59659]: DEBUG nova.compute.manager [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 764.223951] env[59659]: DEBUG nova.compute.manager [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 764.224132] env[59659]: DEBUG nova.network.neutron [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 764.252701] env[59659]: DEBUG nova.network.neutron [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 764.260827] env[59659]: DEBUG nova.network.neutron [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 764.270209] env[59659]: INFO nova.compute.manager [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] [instance: 2d63a2a4-b912-487e-aa10-9e68d877baab] Took 0.05 seconds to deallocate network for instance. [ 764.359760] env[59659]: INFO nova.scheduler.client.report [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] Deleted allocations for instance 2d63a2a4-b912-487e-aa10-9e68d877baab [ 764.377383] env[59659]: DEBUG oslo_concurrency.lockutils [None req-57ce834c-bcce-426a-ba4b-ceda0f1c5b75 tempest-ServersTestFqdnHostnames-512619947 tempest-ServersTestFqdnHostnames-512619947-project-member] Lock "2d63a2a4-b912-487e-aa10-9e68d877baab" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 4.084s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 765.022746] env[59659]: ERROR nova.compute.manager [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port db49ee90-852b-4303-8959-d6a2ac7694bd, please check neutron logs for more information. 
[ 765.022746] env[59659]: ERROR nova.compute.manager Traceback (most recent call last): [ 765.022746] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 765.022746] env[59659]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 765.022746] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 765.022746] env[59659]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 765.022746] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 765.022746] env[59659]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 765.022746] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 765.022746] env[59659]: ERROR nova.compute.manager self.force_reraise() [ 765.022746] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 765.022746] env[59659]: ERROR nova.compute.manager raise self.value [ 765.022746] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 765.022746] env[59659]: ERROR nova.compute.manager updated_port = self._update_port( [ 765.022746] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 765.022746] env[59659]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 765.023210] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 765.023210] env[59659]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 765.023210] env[59659]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port db49ee90-852b-4303-8959-d6a2ac7694bd, please check neutron logs for more information. 
[ 765.023210] env[59659]: ERROR nova.compute.manager [ 765.023210] env[59659]: Traceback (most recent call last): [ 765.023210] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 765.023210] env[59659]: listener.cb(fileno) [ 765.023210] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 765.023210] env[59659]: result = function(*args, **kwargs) [ 765.023210] env[59659]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 765.023210] env[59659]: return func(*args, **kwargs) [ 765.023210] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 765.023210] env[59659]: raise e [ 765.023210] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 765.023210] env[59659]: nwinfo = self.network_api.allocate_for_instance( [ 765.023210] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 765.023210] env[59659]: created_port_ids = self._update_ports_for_instance( [ 765.023210] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 765.023210] env[59659]: with excutils.save_and_reraise_exception(): [ 765.023210] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 765.023210] env[59659]: self.force_reraise() [ 765.023210] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 765.023210] env[59659]: raise self.value [ 765.023210] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 765.023210] env[59659]: updated_port = self._update_port( [ 765.023210] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 765.023210] env[59659]: _ensure_no_port_binding_failure(port) [ 765.023210] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 765.023210] env[59659]: raise exception.PortBindingFailed(port_id=port['id']) [ 765.024022] env[59659]: nova.exception.PortBindingFailed: Binding failed for port db49ee90-852b-4303-8959-d6a2ac7694bd, please check neutron logs for more information. [ 765.024022] env[59659]: Removing descriptor: 22 [ 765.024022] env[59659]: ERROR nova.compute.manager [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port db49ee90-852b-4303-8959-d6a2ac7694bd, please check neutron logs for more information. 
[ 765.024022] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] Traceback (most recent call last): [ 765.024022] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 765.024022] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] yield resources [ 765.024022] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 765.024022] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] self.driver.spawn(context, instance, image_meta, [ 765.024022] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 765.024022] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] self._vmops.spawn(context, instance, image_meta, injected_files, [ 765.024022] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 765.024022] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] vm_ref = self.build_virtual_machine(instance, [ 765.024362] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 765.024362] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] vif_infos = vmwarevif.get_vif_info(self._session, [ 765.024362] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 765.024362] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] for vif in network_info: [ 765.024362] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 765.024362] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] return self._sync_wrapper(fn, *args, **kwargs) [ 765.024362] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 765.024362] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] self.wait() [ 765.024362] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 765.024362] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] self[:] = self._gt.wait() [ 765.024362] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 765.024362] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] return self._exit_event.wait() [ 765.024362] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 765.024362] env[59659]: ERROR nova.compute.manager 
[instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] result = hub.switch() [ 765.024763] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 765.024763] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] return self.greenlet.switch() [ 765.024763] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 765.024763] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] result = function(*args, **kwargs) [ 765.024763] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 765.024763] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] return func(*args, **kwargs) [ 765.024763] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 765.024763] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] raise e [ 765.024763] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 765.024763] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] nwinfo = self.network_api.allocate_for_instance( [ 765.024763] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 765.024763] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] created_port_ids = self._update_ports_for_instance( [ 765.024763] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 765.025123] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] with excutils.save_and_reraise_exception(): [ 765.025123] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 765.025123] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] self.force_reraise() [ 765.025123] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 765.025123] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] raise self.value [ 765.025123] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 765.025123] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] updated_port = self._update_port( [ 765.025123] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 765.025123] env[59659]: ERROR nova.compute.manager [instance: 
f31b7ca3-60d1-4206-ac49-f85ec6194f85] _ensure_no_port_binding_failure(port) [ 765.025123] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 765.025123] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] raise exception.PortBindingFailed(port_id=port['id']) [ 765.025123] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] nova.exception.PortBindingFailed: Binding failed for port db49ee90-852b-4303-8959-d6a2ac7694bd, please check neutron logs for more information. [ 765.025123] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] [ 765.025512] env[59659]: INFO nova.compute.manager [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] Terminating instance [ 765.027477] env[59659]: DEBUG oslo_concurrency.lockutils [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] Acquiring lock "refresh_cache-f31b7ca3-60d1-4206-ac49-f85ec6194f85" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 765.027725] env[59659]: DEBUG oslo_concurrency.lockutils [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] Acquired lock "refresh_cache-f31b7ca3-60d1-4206-ac49-f85ec6194f85" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 765.027893] env[59659]: DEBUG nova.network.neutron [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 765.056718] env[59659]: DEBUG nova.network.neutron [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] Instance cache missing network info. 
{{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 765.165183] env[59659]: DEBUG nova.network.neutron [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 765.175145] env[59659]: DEBUG oslo_concurrency.lockutils [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] Releasing lock "refresh_cache-f31b7ca3-60d1-4206-ac49-f85ec6194f85" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 765.175702] env[59659]: DEBUG nova.compute.manager [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] Start destroying the instance on the hypervisor. {{(pid=59659) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 765.176242] env[59659]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-3076deb8-4dad-4774-ab36-2972a7063e32 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 765.190122] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f364a808-16bf-4889-b56e-c44b5772cb62 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 765.216484] env[59659]: WARNING nova.virt.vmwareapi.driver [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] Instance does not exists. Proceeding to delete instance properties on datastore: nova.exception.InstanceNotFound: Instance f31b7ca3-60d1-4206-ac49-f85ec6194f85 could not be found. [ 765.216923] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] Destroying instance {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 765.217128] env[59659]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-4673633f-a8fe-414b-a0cc-8dfc5e7070e0 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 765.225819] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-531756a2-7376-4bb2-b1bc-9a46a403c981 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 765.251620] env[59659]: WARNING nova.virt.vmwareapi.vmops [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance f31b7ca3-60d1-4206-ac49-f85ec6194f85 could not be found. 
[ 765.251821] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] Instance destroyed {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 765.251998] env[59659]: INFO nova.compute.manager [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] Took 0.08 seconds to destroy the instance on the hypervisor. [ 765.252258] env[59659]: DEBUG oslo.service.loopingcall [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59659) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 765.252888] env[59659]: DEBUG nova.compute.manager [-] [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 765.252980] env[59659]: DEBUG nova.network.neutron [-] [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 765.272011] env[59659]: DEBUG nova.network.neutron [-] [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 765.279822] env[59659]: DEBUG nova.network.neutron [-] [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 765.289116] env[59659]: INFO nova.compute.manager [-] [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] Took 0.04 seconds to deallocate network for instance. [ 765.365858] env[59659]: INFO nova.compute.manager [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] Took 0.08 seconds to detach 1 volumes for instance. 
[ 765.367986] env[59659]: DEBUG nova.compute.claims [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] Aborting claim: {{(pid=59659) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 765.368173] env[59659]: DEBUG oslo_concurrency.lockutils [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 765.368381] env[59659]: DEBUG oslo_concurrency.lockutils [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 765.514597] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d83034e-3640-4bcc-b26c-622d17801bc8 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 765.529846] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-63129a93-a6c2-410e-bfb0-37bb2a4f2aee {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 765.566615] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b185452-9ba7-48bf-8311-f77fb81a021e {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 765.575869] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d9a761a1-c223-4bd7-976e-1cafda237497 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 765.591980] env[59659]: DEBUG nova.compute.provider_tree [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 765.602354] env[59659]: DEBUG nova.scheduler.client.report [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 765.621729] env[59659]: DEBUG oslo_concurrency.lockutils [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 
tempest-ServerActionsV293TestJSON-1756577926-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.253s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 765.622820] env[59659]: ERROR nova.compute.manager [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port db49ee90-852b-4303-8959-d6a2ac7694bd, please check neutron logs for more information. [ 765.622820] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] Traceback (most recent call last): [ 765.622820] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 765.622820] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] self.driver.spawn(context, instance, image_meta, [ 765.622820] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 765.622820] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] self._vmops.spawn(context, instance, image_meta, injected_files, [ 765.622820] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 765.622820] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] vm_ref = self.build_virtual_machine(instance, [ 765.622820] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 765.622820] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] vif_infos = vmwarevif.get_vif_info(self._session, [ 765.622820] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 765.623234] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] for vif in network_info: [ 765.623234] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 765.623234] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] return self._sync_wrapper(fn, *args, **kwargs) [ 765.623234] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 765.623234] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] self.wait() [ 765.623234] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 765.623234] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] self[:] = self._gt.wait() [ 765.623234] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] File 
"/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 765.623234] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] return self._exit_event.wait() [ 765.623234] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 765.623234] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] result = hub.switch() [ 765.623234] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 765.623234] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] return self.greenlet.switch() [ 765.623234] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 765.623600] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] result = function(*args, **kwargs) [ 765.623600] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 765.623600] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] return func(*args, **kwargs) [ 765.623600] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 765.623600] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] raise e [ 765.623600] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 765.623600] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] nwinfo = self.network_api.allocate_for_instance( [ 765.623600] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 765.623600] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] created_port_ids = self._update_ports_for_instance( [ 765.623600] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 765.623600] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] with excutils.save_and_reraise_exception(): [ 765.623600] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 765.623600] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] self.force_reraise() [ 765.623951] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 765.623951] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] raise self.value [ 765.623951] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] File 
"/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 765.623951] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] updated_port = self._update_port( [ 765.623951] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 765.623951] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] _ensure_no_port_binding_failure(port) [ 765.623951] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 765.623951] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] raise exception.PortBindingFailed(port_id=port['id']) [ 765.623951] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] nova.exception.PortBindingFailed: Binding failed for port db49ee90-852b-4303-8959-d6a2ac7694bd, please check neutron logs for more information. [ 765.623951] env[59659]: ERROR nova.compute.manager [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] [ 765.623951] env[59659]: DEBUG nova.compute.utils [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] Binding failed for port db49ee90-852b-4303-8959-d6a2ac7694bd, please check neutron logs for more information. {{(pid=59659) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 765.626053] env[59659]: DEBUG nova.compute.manager [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] Build of instance f31b7ca3-60d1-4206-ac49-f85ec6194f85 was re-scheduled: Binding failed for port db49ee90-852b-4303-8959-d6a2ac7694bd, please check neutron logs for more information. 
{{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 765.626442] env[59659]: DEBUG nova.compute.manager [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] Unplugging VIFs for instance {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 765.626672] env[59659]: DEBUG oslo_concurrency.lockutils [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] Acquiring lock "refresh_cache-f31b7ca3-60d1-4206-ac49-f85ec6194f85" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 765.626818] env[59659]: DEBUG oslo_concurrency.lockutils [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] Acquired lock "refresh_cache-f31b7ca3-60d1-4206-ac49-f85ec6194f85" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 765.626968] env[59659]: DEBUG nova.network.neutron [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 765.678121] env[59659]: DEBUG nova.network.neutron [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 765.784038] env[59659]: DEBUG nova.network.neutron [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 765.793711] env[59659]: DEBUG oslo_concurrency.lockutils [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] Releasing lock "refresh_cache-f31b7ca3-60d1-4206-ac49-f85ec6194f85" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 765.793975] env[59659]: DEBUG nova.compute.manager [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 765.794143] env[59659]: DEBUG nova.compute.manager [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 765.794291] env[59659]: DEBUG nova.network.neutron [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 765.814366] env[59659]: DEBUG nova.network.neutron [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 765.824581] env[59659]: DEBUG nova.network.neutron [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 765.838223] env[59659]: INFO nova.compute.manager [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] [instance: f31b7ca3-60d1-4206-ac49-f85ec6194f85] Took 0.04 seconds to deallocate network for instance. [ 765.961976] env[59659]: INFO nova.scheduler.client.report [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] Deleted allocations for instance f31b7ca3-60d1-4206-ac49-f85ec6194f85 [ 765.983814] env[59659]: DEBUG oslo_concurrency.lockutils [None req-6c2ffaa1-e526-4f8f-8dd1-68bec88e766a tempest-ServerActionsV293TestJSON-1756577926 tempest-ServerActionsV293TestJSON-1756577926-project-member] Lock "f31b7ca3-60d1-4206-ac49-f85ec6194f85" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 2.878s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 768.563617] env[59659]: ERROR nova.compute.manager [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 560b15a6-3e21-4068-94d0-df0d6e201268, please check neutron logs for more information. 
[ 768.563617] env[59659]: ERROR nova.compute.manager Traceback (most recent call last): [ 768.563617] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 768.563617] env[59659]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 768.563617] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 768.563617] env[59659]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 768.563617] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 768.563617] env[59659]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 768.563617] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 768.563617] env[59659]: ERROR nova.compute.manager self.force_reraise() [ 768.563617] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 768.563617] env[59659]: ERROR nova.compute.manager raise self.value [ 768.563617] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 768.563617] env[59659]: ERROR nova.compute.manager updated_port = self._update_port( [ 768.563617] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 768.563617] env[59659]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 768.564260] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 768.564260] env[59659]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 768.564260] env[59659]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 560b15a6-3e21-4068-94d0-df0d6e201268, please check neutron logs for more information. 
[ 768.564260] env[59659]: ERROR nova.compute.manager [ 768.564260] env[59659]: Traceback (most recent call last): [ 768.564260] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 768.564260] env[59659]: listener.cb(fileno) [ 768.564260] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 768.564260] env[59659]: result = function(*args, **kwargs) [ 768.564260] env[59659]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 768.564260] env[59659]: return func(*args, **kwargs) [ 768.564260] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 768.564260] env[59659]: raise e [ 768.564260] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 768.564260] env[59659]: nwinfo = self.network_api.allocate_for_instance( [ 768.564260] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 768.564260] env[59659]: created_port_ids = self._update_ports_for_instance( [ 768.564260] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 768.564260] env[59659]: with excutils.save_and_reraise_exception(): [ 768.564260] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 768.564260] env[59659]: self.force_reraise() [ 768.564260] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 768.564260] env[59659]: raise self.value [ 768.564260] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 768.564260] env[59659]: updated_port = self._update_port( [ 768.564260] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 768.564260] env[59659]: _ensure_no_port_binding_failure(port) [ 768.564260] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 768.564260] env[59659]: raise exception.PortBindingFailed(port_id=port['id']) [ 768.565016] env[59659]: nova.exception.PortBindingFailed: Binding failed for port 560b15a6-3e21-4068-94d0-df0d6e201268, please check neutron logs for more information. [ 768.565016] env[59659]: Removing descriptor: 14 [ 768.565016] env[59659]: ERROR nova.compute.manager [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] [instance: 0ed4be35-b845-48ca-b892-657d96c12728] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 560b15a6-3e21-4068-94d0-df0d6e201268, please check neutron logs for more information. 
[ 768.565016] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] Traceback (most recent call last): [ 768.565016] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 768.565016] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] yield resources [ 768.565016] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 768.565016] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] self.driver.spawn(context, instance, image_meta, [ 768.565016] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 768.565016] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] self._vmops.spawn(context, instance, image_meta, injected_files, [ 768.565016] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 768.565016] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] vm_ref = self.build_virtual_machine(instance, [ 768.565347] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 768.565347] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] vif_infos = vmwarevif.get_vif_info(self._session, [ 768.565347] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 768.565347] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] for vif in network_info: [ 768.565347] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 768.565347] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] return self._sync_wrapper(fn, *args, **kwargs) [ 768.565347] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 768.565347] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] self.wait() [ 768.565347] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 768.565347] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] self[:] = self._gt.wait() [ 768.565347] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 768.565347] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] return self._exit_event.wait() [ 768.565347] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 768.565347] env[59659]: ERROR nova.compute.manager 
[instance: 0ed4be35-b845-48ca-b892-657d96c12728] result = hub.switch() [ 768.566369] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 768.566369] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] return self.greenlet.switch() [ 768.566369] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 768.566369] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] result = function(*args, **kwargs) [ 768.566369] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 768.566369] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] return func(*args, **kwargs) [ 768.566369] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 768.566369] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] raise e [ 768.566369] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 768.566369] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] nwinfo = self.network_api.allocate_for_instance( [ 768.566369] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 768.566369] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] created_port_ids = self._update_ports_for_instance( [ 768.566369] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 768.566715] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] with excutils.save_and_reraise_exception(): [ 768.566715] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 768.566715] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] self.force_reraise() [ 768.566715] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 768.566715] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] raise self.value [ 768.566715] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 768.566715] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] updated_port = self._update_port( [ 768.566715] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 768.566715] env[59659]: ERROR nova.compute.manager [instance: 
0ed4be35-b845-48ca-b892-657d96c12728] _ensure_no_port_binding_failure(port) [ 768.566715] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 768.566715] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] raise exception.PortBindingFailed(port_id=port['id']) [ 768.566715] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] nova.exception.PortBindingFailed: Binding failed for port 560b15a6-3e21-4068-94d0-df0d6e201268, please check neutron logs for more information. [ 768.566715] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] [ 768.567148] env[59659]: INFO nova.compute.manager [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] [instance: 0ed4be35-b845-48ca-b892-657d96c12728] Terminating instance [ 768.568876] env[59659]: DEBUG oslo_concurrency.lockutils [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] Acquiring lock "refresh_cache-0ed4be35-b845-48ca-b892-657d96c12728" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 768.569050] env[59659]: DEBUG oslo_concurrency.lockutils [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] Acquired lock "refresh_cache-0ed4be35-b845-48ca-b892-657d96c12728" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 768.569586] env[59659]: DEBUG nova.network.neutron [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] [instance: 0ed4be35-b845-48ca-b892-657d96c12728] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 768.577392] env[59659]: ERROR nova.compute.manager [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 919aae49-1ef4-4d7f-a76b-82c6e8107512, please check neutron logs for more information. 
[ 768.577392] env[59659]: ERROR nova.compute.manager Traceback (most recent call last): [ 768.577392] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 768.577392] env[59659]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 768.577392] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 768.577392] env[59659]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 768.577392] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 768.577392] env[59659]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 768.577392] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 768.577392] env[59659]: ERROR nova.compute.manager self.force_reraise() [ 768.577392] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 768.577392] env[59659]: ERROR nova.compute.manager raise self.value [ 768.577392] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 768.577392] env[59659]: ERROR nova.compute.manager updated_port = self._update_port( [ 768.577392] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 768.577392] env[59659]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 768.577923] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 768.577923] env[59659]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 768.577923] env[59659]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 919aae49-1ef4-4d7f-a76b-82c6e8107512, please check neutron logs for more information. 
[ 768.577923] env[59659]: ERROR nova.compute.manager [ 768.577923] env[59659]: Traceback (most recent call last): [ 768.577923] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 768.577923] env[59659]: listener.cb(fileno) [ 768.577923] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 768.577923] env[59659]: result = function(*args, **kwargs) [ 768.577923] env[59659]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 768.577923] env[59659]: return func(*args, **kwargs) [ 768.577923] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 768.577923] env[59659]: raise e [ 768.577923] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 768.577923] env[59659]: nwinfo = self.network_api.allocate_for_instance( [ 768.577923] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 768.577923] env[59659]: created_port_ids = self._update_ports_for_instance( [ 768.577923] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 768.577923] env[59659]: with excutils.save_and_reraise_exception(): [ 768.577923] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 768.577923] env[59659]: self.force_reraise() [ 768.577923] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 768.577923] env[59659]: raise self.value [ 768.577923] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 768.577923] env[59659]: updated_port = self._update_port( [ 768.577923] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 768.577923] env[59659]: _ensure_no_port_binding_failure(port) [ 768.577923] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 768.577923] env[59659]: raise exception.PortBindingFailed(port_id=port['id']) [ 768.578732] env[59659]: nova.exception.PortBindingFailed: Binding failed for port 919aae49-1ef4-4d7f-a76b-82c6e8107512, please check neutron logs for more information. [ 768.578732] env[59659]: Removing descriptor: 23 [ 768.579260] env[59659]: ERROR nova.compute.manager [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] [instance: 938a2016-8eaa-446a-b69c-3af59448d944] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 919aae49-1ef4-4d7f-a76b-82c6e8107512, please check neutron logs for more information. 
[ 768.579260] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] Traceback (most recent call last): [ 768.579260] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 768.579260] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] yield resources [ 768.579260] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 768.579260] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] self.driver.spawn(context, instance, image_meta, [ 768.579260] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 768.579260] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] self._vmops.spawn(context, instance, image_meta, injected_files, [ 768.579260] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 768.579260] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] vm_ref = self.build_virtual_machine(instance, [ 768.579260] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 768.579737] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] vif_infos = vmwarevif.get_vif_info(self._session, [ 768.579737] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 768.579737] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] for vif in network_info: [ 768.579737] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 768.579737] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] return self._sync_wrapper(fn, *args, **kwargs) [ 768.579737] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 768.579737] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] self.wait() [ 768.579737] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 768.579737] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] self[:] = self._gt.wait() [ 768.579737] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 768.579737] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] return self._exit_event.wait() [ 768.579737] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 768.579737] env[59659]: ERROR nova.compute.manager 
[instance: 938a2016-8eaa-446a-b69c-3af59448d944] result = hub.switch() [ 768.579737] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 768.580126] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] return self.greenlet.switch() [ 768.580126] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 768.580126] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] result = function(*args, **kwargs) [ 768.580126] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 768.580126] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] return func(*args, **kwargs) [ 768.580126] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 768.580126] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] raise e [ 768.580126] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 768.580126] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] nwinfo = self.network_api.allocate_for_instance( [ 768.580126] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 768.580126] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] created_port_ids = self._update_ports_for_instance( [ 768.580126] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 768.580126] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] with excutils.save_and_reraise_exception(): [ 768.580517] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 768.580517] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] self.force_reraise() [ 768.580517] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 768.580517] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] raise self.value [ 768.580517] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 768.580517] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] updated_port = self._update_port( [ 768.580517] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 768.580517] env[59659]: ERROR nova.compute.manager [instance: 
938a2016-8eaa-446a-b69c-3af59448d944] _ensure_no_port_binding_failure(port) [ 768.580517] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 768.580517] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] raise exception.PortBindingFailed(port_id=port['id']) [ 768.580517] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] nova.exception.PortBindingFailed: Binding failed for port 919aae49-1ef4-4d7f-a76b-82c6e8107512, please check neutron logs for more information. [ 768.580517] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] [ 768.580882] env[59659]: INFO nova.compute.manager [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] [instance: 938a2016-8eaa-446a-b69c-3af59448d944] Terminating instance [ 768.583588] env[59659]: DEBUG oslo_concurrency.lockutils [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] Acquiring lock "refresh_cache-938a2016-8eaa-446a-b69c-3af59448d944" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 768.583770] env[59659]: DEBUG oslo_concurrency.lockutils [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] Acquired lock "refresh_cache-938a2016-8eaa-446a-b69c-3af59448d944" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 768.583910] env[59659]: DEBUG nova.network.neutron [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] [instance: 938a2016-8eaa-446a-b69c-3af59448d944] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 768.636992] env[59659]: DEBUG nova.network.neutron [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] [instance: 0ed4be35-b845-48ca-b892-657d96c12728] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 768.639807] env[59659]: DEBUG nova.network.neutron [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] [instance: 938a2016-8eaa-446a-b69c-3af59448d944] Instance cache missing network info. 
{{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 769.200291] env[59659]: DEBUG nova.network.neutron [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] [instance: 938a2016-8eaa-446a-b69c-3af59448d944] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 769.213770] env[59659]: DEBUG oslo_concurrency.lockutils [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] Releasing lock "refresh_cache-938a2016-8eaa-446a-b69c-3af59448d944" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 769.214598] env[59659]: DEBUG nova.compute.manager [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] [instance: 938a2016-8eaa-446a-b69c-3af59448d944] Start destroying the instance on the hypervisor. {{(pid=59659) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 769.214982] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] [instance: 938a2016-8eaa-446a-b69c-3af59448d944] Destroying instance {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 769.216073] env[59659]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-3e9e8d0b-24b1-4821-b390-afee07d58db0 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 769.222687] env[59659]: DEBUG nova.network.neutron [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] [instance: 0ed4be35-b845-48ca-b892-657d96c12728] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 769.228502] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-082e8c79-d076-4bb7-ab70-9906e6ffcbd8 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 769.242501] env[59659]: DEBUG oslo_concurrency.lockutils [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] Releasing lock "refresh_cache-0ed4be35-b845-48ca-b892-657d96c12728" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 769.243042] env[59659]: DEBUG nova.compute.manager [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] [instance: 0ed4be35-b845-48ca-b892-657d96c12728] Start destroying the instance on the hypervisor. 
{{(pid=59659) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 769.243332] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] [instance: 0ed4be35-b845-48ca-b892-657d96c12728] Destroying instance {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 769.244333] env[59659]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-d6765ebb-62a3-4b02-ab8f-5923ad1c7a9f {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 769.254061] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1982714e-d328-479e-9d83-1abf16d36060 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 769.272239] env[59659]: WARNING nova.virt.vmwareapi.vmops [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] [instance: 938a2016-8eaa-446a-b69c-3af59448d944] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 938a2016-8eaa-446a-b69c-3af59448d944 could not be found. [ 769.272579] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] [instance: 938a2016-8eaa-446a-b69c-3af59448d944] Instance destroyed {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 769.272944] env[59659]: INFO nova.compute.manager [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] [instance: 938a2016-8eaa-446a-b69c-3af59448d944] Took 0.06 seconds to destroy the instance on the hypervisor. [ 769.273322] env[59659]: DEBUG oslo.service.loopingcall [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59659) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 769.274413] env[59659]: DEBUG nova.compute.manager [-] [instance: 938a2016-8eaa-446a-b69c-3af59448d944] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 769.274619] env[59659]: DEBUG nova.network.neutron [-] [instance: 938a2016-8eaa-446a-b69c-3af59448d944] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 769.288947] env[59659]: WARNING nova.virt.vmwareapi.vmops [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] [instance: 0ed4be35-b845-48ca-b892-657d96c12728] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 0ed4be35-b845-48ca-b892-657d96c12728 could not be found. 
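The PortBindingFailed tracebacks in this section all funnel through nova/network/neutron.py: _update_port() calls _ensure_no_port_binding_failure(port), which raises PortBindingFailed(port_id=port['id']) when Neutron reports that the port binding failed. A paraphrased, self-contained sketch of that check follows — the 'binding:vif_type' == 'binding_failed' test reflects my reading of how Neutron marks a failed binding and is not quoted from this log:

```python
# Paraphrased sketch of the check behind the PortBindingFailed errors in this
# log; not the verbatim nova source.
class PortBindingFailed(Exception):
    def __init__(self, port_id):
        super().__init__(
            "Binding failed for port %s, please check neutron logs for more "
            "information." % port_id)

def ensure_no_port_binding_failure(port):
    # Neutron leaves binding:vif_type set to "binding_failed" when no mechanism
    # driver could bind the port on the requested host.
    if port.get('binding:vif_type') == 'binding_failed':
        raise PortBindingFailed(port_id=port['id'])

# A trimmed port dict as the Neutron API might return it (values illustrative):
ensure_no_port_binding_failure({'id': 'db49ee90-852b-4303-8959-d6a2ac7694bd',
                                'binding:vif_type': 'ovs'})  # no exception
```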
[ 769.289250] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] [instance: 0ed4be35-b845-48ca-b892-657d96c12728] Instance destroyed {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 769.289487] env[59659]: INFO nova.compute.manager [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] [instance: 0ed4be35-b845-48ca-b892-657d96c12728] Took 0.05 seconds to destroy the instance on the hypervisor. [ 769.289800] env[59659]: DEBUG oslo.service.loopingcall [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59659) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 769.290099] env[59659]: DEBUG nova.compute.manager [-] [instance: 0ed4be35-b845-48ca-b892-657d96c12728] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 769.290490] env[59659]: DEBUG nova.network.neutron [-] [instance: 0ed4be35-b845-48ca-b892-657d96c12728] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 769.337305] env[59659]: DEBUG nova.network.neutron [-] [instance: 938a2016-8eaa-446a-b69c-3af59448d944] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 769.339760] env[59659]: DEBUG nova.network.neutron [-] [instance: 0ed4be35-b845-48ca-b892-657d96c12728] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 769.347069] env[59659]: DEBUG nova.network.neutron [-] [instance: 938a2016-8eaa-446a-b69c-3af59448d944] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 769.348664] env[59659]: DEBUG nova.network.neutron [-] [instance: 0ed4be35-b845-48ca-b892-657d96c12728] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 769.356943] env[59659]: INFO nova.compute.manager [-] [instance: 0ed4be35-b845-48ca-b892-657d96c12728] Took 0.07 seconds to deallocate network for instance. [ 769.360177] env[59659]: INFO nova.compute.manager [-] [instance: 938a2016-8eaa-446a-b69c-3af59448d944] Took 0.08 seconds to deallocate network for instance. 
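Because the same "Binding failed for port ..., please check neutron logs for more information." message recurs for several instances here, a small, hypothetical triage helper can summarize which ports and instances are affected. The log file name is a placeholder, and the regexes rely only on the message format visible above:

```python
# Hypothetical triage helper: map each failed Neutron port to the instances
# that hit "Binding failed for port ..." in this log.
import re
from collections import defaultdict

PORT_RE = re.compile(r"Binding failed for port ([0-9a-f-]{36})")
INSTANCE_RE = re.compile(r"\[instance: ([0-9a-f-]{36})\]")

def failed_bindings(log_path):
    failures = defaultdict(set)  # port id -> instance ids seen on the same line
    with open(log_path) as fh:
        for line in fh:
            port = PORT_RE.search(line)
            if not port:
                continue
            instance = INSTANCE_RE.search(line)
            failures[port.group(1)].add(instance.group(1) if instance else "?")
    return failures

if __name__ == "__main__":
    for port, instances in sorted(failed_bindings("n-cpu.log").items()):
        print(port, sorted(instances))
```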
[ 769.360177] env[59659]: DEBUG nova.compute.claims [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] [instance: 0ed4be35-b845-48ca-b892-657d96c12728] Aborting claim: {{(pid=59659) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 769.360341] env[59659]: DEBUG oslo_concurrency.lockutils [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 769.360755] env[59659]: DEBUG oslo_concurrency.lockutils [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 769.364624] env[59659]: DEBUG nova.compute.claims [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] [instance: 938a2016-8eaa-446a-b69c-3af59448d944] Aborting claim: {{(pid=59659) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 769.364876] env[59659]: DEBUG oslo_concurrency.lockutils [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 769.372832] env[59659]: ERROR nova.compute.manager [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 35682cb9-e9d7-4838-847e-d5bcc61f6753, please check neutron logs for more information. 
[ 769.372832] env[59659]: ERROR nova.compute.manager Traceback (most recent call last): [ 769.372832] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 769.372832] env[59659]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 769.372832] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 769.372832] env[59659]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 769.372832] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 769.372832] env[59659]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 769.372832] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 769.372832] env[59659]: ERROR nova.compute.manager self.force_reraise() [ 769.372832] env[59659]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 769.372832] env[59659]: ERROR nova.compute.manager raise self.value [ 769.372832] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 769.372832] env[59659]: ERROR nova.compute.manager updated_port = self._update_port( [ 769.372832] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 769.372832] env[59659]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 769.373301] env[59659]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 769.373301] env[59659]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 769.373301] env[59659]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 35682cb9-e9d7-4838-847e-d5bcc61f6753, please check neutron logs for more information. 
[ 769.373301] env[59659]: ERROR nova.compute.manager [ 769.373301] env[59659]: Traceback (most recent call last): [ 769.373301] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 769.373301] env[59659]: listener.cb(fileno) [ 769.373301] env[59659]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 769.373301] env[59659]: result = function(*args, **kwargs) [ 769.373301] env[59659]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 769.373301] env[59659]: return func(*args, **kwargs) [ 769.373301] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 769.373301] env[59659]: raise e [ 769.373301] env[59659]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 769.373301] env[59659]: nwinfo = self.network_api.allocate_for_instance( [ 769.373301] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 769.373301] env[59659]: created_port_ids = self._update_ports_for_instance( [ 769.373301] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 769.373301] env[59659]: with excutils.save_and_reraise_exception(): [ 769.373301] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 769.373301] env[59659]: self.force_reraise() [ 769.373301] env[59659]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 769.373301] env[59659]: raise self.value [ 769.373301] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 769.373301] env[59659]: updated_port = self._update_port( [ 769.373301] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 769.373301] env[59659]: _ensure_no_port_binding_failure(port) [ 769.373301] env[59659]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 769.373301] env[59659]: raise exception.PortBindingFailed(port_id=port['id']) [ 769.374017] env[59659]: nova.exception.PortBindingFailed: Binding failed for port 35682cb9-e9d7-4838-847e-d5bcc61f6753, please check neutron logs for more information. [ 769.374017] env[59659]: Removing descriptor: 21 [ 769.374017] env[59659]: ERROR nova.compute.manager [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 35682cb9-e9d7-4838-847e-d5bcc61f6753, please check neutron logs for more information. 
[ 769.374017] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] Traceback (most recent call last): [ 769.374017] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 769.374017] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] yield resources [ 769.374017] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 769.374017] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] self.driver.spawn(context, instance, image_meta, [ 769.374017] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 769.374017] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] self._vmops.spawn(context, instance, image_meta, injected_files, [ 769.374017] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 769.374017] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] vm_ref = self.build_virtual_machine(instance, [ 769.375098] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 769.375098] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] vif_infos = vmwarevif.get_vif_info(self._session, [ 769.375098] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 769.375098] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] for vif in network_info: [ 769.375098] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 769.375098] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] return self._sync_wrapper(fn, *args, **kwargs) [ 769.375098] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 769.375098] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] self.wait() [ 769.375098] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 769.375098] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] self[:] = self._gt.wait() [ 769.375098] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 769.375098] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] return self._exit_event.wait() [ 769.375098] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 769.375098] env[59659]: ERROR nova.compute.manager 
[instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] result = hub.switch() [ 769.375577] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 769.375577] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] return self.greenlet.switch() [ 769.375577] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 769.375577] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] result = function(*args, **kwargs) [ 769.375577] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 769.375577] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] return func(*args, **kwargs) [ 769.375577] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 769.375577] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] raise e [ 769.375577] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 769.375577] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] nwinfo = self.network_api.allocate_for_instance( [ 769.375577] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 769.375577] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] created_port_ids = self._update_ports_for_instance( [ 769.375577] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 769.375907] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] with excutils.save_and_reraise_exception(): [ 769.375907] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 769.375907] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] self.force_reraise() [ 769.375907] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 769.375907] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] raise self.value [ 769.375907] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 769.375907] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] updated_port = self._update_port( [ 769.375907] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 769.375907] env[59659]: ERROR nova.compute.manager [instance: 
72a92098-562e-47bf-8dde-8b62b182d7bb] _ensure_no_port_binding_failure(port) [ 769.375907] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 769.375907] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] raise exception.PortBindingFailed(port_id=port['id']) [ 769.375907] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] nova.exception.PortBindingFailed: Binding failed for port 35682cb9-e9d7-4838-847e-d5bcc61f6753, please check neutron logs for more information. [ 769.375907] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] [ 769.376655] env[59659]: INFO nova.compute.manager [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] Terminating instance [ 769.377792] env[59659]: DEBUG oslo_concurrency.lockutils [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Acquiring lock "refresh_cache-72a92098-562e-47bf-8dde-8b62b182d7bb" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 769.378025] env[59659]: DEBUG oslo_concurrency.lockutils [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Acquired lock "refresh_cache-72a92098-562e-47bf-8dde-8b62b182d7bb" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 769.378401] env[59659]: DEBUG nova.network.neutron [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 769.420073] env[59659]: DEBUG nova.network.neutron [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] Instance cache missing network info. 
{{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 769.503930] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-37778da6-3833-41d0-8845-ac182e9ce5a0 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 769.517367] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-92a744f4-3677-4881-8568-278f5b6ed028 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 769.552811] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e432022b-f9af-4b14-bc44-dcb563722973 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 769.562270] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe9cedc8-6d98-4f47-ac6f-9b7bb3783d0f {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 769.577073] env[59659]: DEBUG nova.compute.provider_tree [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 769.586700] env[59659]: DEBUG nova.scheduler.client.report [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 769.603978] env[59659]: DEBUG oslo_concurrency.lockutils [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.243s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 769.604619] env[59659]: ERROR nova.compute.manager [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] [instance: 0ed4be35-b845-48ca-b892-657d96c12728] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 560b15a6-3e21-4068-94d0-df0d6e201268, please check neutron logs for more information. 
[ 769.604619] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] Traceback (most recent call last): [ 769.604619] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 769.604619] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] self.driver.spawn(context, instance, image_meta, [ 769.604619] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 769.604619] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] self._vmops.spawn(context, instance, image_meta, injected_files, [ 769.604619] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 769.604619] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] vm_ref = self.build_virtual_machine(instance, [ 769.604619] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 769.604619] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] vif_infos = vmwarevif.get_vif_info(self._session, [ 769.604619] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 769.604990] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] for vif in network_info: [ 769.604990] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 769.604990] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] return self._sync_wrapper(fn, *args, **kwargs) [ 769.604990] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 769.604990] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] self.wait() [ 769.604990] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 769.604990] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] self[:] = self._gt.wait() [ 769.604990] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 769.604990] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] return self._exit_event.wait() [ 769.604990] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 769.604990] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] result = hub.switch() [ 769.604990] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 769.604990] env[59659]: ERROR 
nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] return self.greenlet.switch() [ 769.604990] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 769.605350] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] result = function(*args, **kwargs) [ 769.605350] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 769.605350] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] return func(*args, **kwargs) [ 769.605350] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 769.605350] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] raise e [ 769.605350] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 769.605350] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] nwinfo = self.network_api.allocate_for_instance( [ 769.605350] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 769.605350] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] created_port_ids = self._update_ports_for_instance( [ 769.605350] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 769.605350] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] with excutils.save_and_reraise_exception(): [ 769.605350] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 769.605350] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] self.force_reraise() [ 769.605654] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 769.605654] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] raise self.value [ 769.605654] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 769.605654] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] updated_port = self._update_port( [ 769.605654] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 769.605654] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] _ensure_no_port_binding_failure(port) [ 769.605654] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 769.605654] env[59659]: ERROR 
nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] raise exception.PortBindingFailed(port_id=port['id']) [ 769.605654] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] nova.exception.PortBindingFailed: Binding failed for port 560b15a6-3e21-4068-94d0-df0d6e201268, please check neutron logs for more information. [ 769.605654] env[59659]: ERROR nova.compute.manager [instance: 0ed4be35-b845-48ca-b892-657d96c12728] [ 769.605911] env[59659]: DEBUG nova.compute.utils [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] [instance: 0ed4be35-b845-48ca-b892-657d96c12728] Binding failed for port 560b15a6-3e21-4068-94d0-df0d6e201268, please check neutron logs for more information. {{(pid=59659) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 769.606793] env[59659]: DEBUG oslo_concurrency.lockutils [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.242s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 769.609832] env[59659]: DEBUG nova.compute.manager [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] [instance: 0ed4be35-b845-48ca-b892-657d96c12728] Build of instance 0ed4be35-b845-48ca-b892-657d96c12728 was re-scheduled: Binding failed for port 560b15a6-3e21-4068-94d0-df0d6e201268, please check neutron logs for more information. 
{{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 769.610357] env[59659]: DEBUG nova.compute.manager [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] [instance: 0ed4be35-b845-48ca-b892-657d96c12728] Unplugging VIFs for instance {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 769.610634] env[59659]: DEBUG oslo_concurrency.lockutils [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] Acquiring lock "refresh_cache-0ed4be35-b845-48ca-b892-657d96c12728" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 769.610828] env[59659]: DEBUG oslo_concurrency.lockutils [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] Acquired lock "refresh_cache-0ed4be35-b845-48ca-b892-657d96c12728" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 769.611042] env[59659]: DEBUG nova.network.neutron [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] [instance: 0ed4be35-b845-48ca-b892-657d96c12728] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 769.677352] env[59659]: DEBUG nova.network.neutron [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] [instance: 0ed4be35-b845-48ca-b892-657d96c12728] Instance cache missing network info. 
{{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 769.739906] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ee0906cf-793c-4f50-8ebe-ba7b6bc8710e {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 769.750182] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f16bca7-404d-467c-a202-594afed15235 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 769.780650] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33290ce9-eced-4ba1-a8ab-da4c2d536e22 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 769.789030] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d701b99b-048e-457c-b477-7378c5734adc {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 769.797716] env[59659]: DEBUG nova.network.neutron [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 769.808025] env[59659]: DEBUG nova.compute.provider_tree [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 769.812253] env[59659]: DEBUG oslo_concurrency.lockutils [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Releasing lock "refresh_cache-72a92098-562e-47bf-8dde-8b62b182d7bb" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 769.812253] env[59659]: DEBUG nova.compute.manager [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] Start destroying the instance on the hypervisor. 
{{(pid=59659) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 769.812253] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] Destroying instance {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 769.812253] env[59659]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-bfbd2785-904a-4cc6-b51a-6846ff3079c6 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 769.821479] env[59659]: DEBUG nova.scheduler.client.report [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 769.825212] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d7a4fb9a-d9b5-45ca-b3f5-7d6cc5ca321e {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 769.839277] env[59659]: DEBUG oslo_concurrency.lockutils [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.232s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 769.839940] env[59659]: ERROR nova.compute.manager [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] [instance: 938a2016-8eaa-446a-b69c-3af59448d944] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 919aae49-1ef4-4d7f-a76b-82c6e8107512, please check neutron logs for more information. 
[ 769.839940] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] Traceback (most recent call last): [ 769.839940] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 769.839940] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] self.driver.spawn(context, instance, image_meta, [ 769.839940] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 769.839940] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] self._vmops.spawn(context, instance, image_meta, injected_files, [ 769.839940] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 769.839940] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] vm_ref = self.build_virtual_machine(instance, [ 769.839940] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 769.839940] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] vif_infos = vmwarevif.get_vif_info(self._session, [ 769.839940] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 769.840425] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] for vif in network_info: [ 769.840425] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 769.840425] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] return self._sync_wrapper(fn, *args, **kwargs) [ 769.840425] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 769.840425] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] self.wait() [ 769.840425] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 769.840425] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] self[:] = self._gt.wait() [ 769.840425] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 769.840425] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] return self._exit_event.wait() [ 769.840425] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 769.840425] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] result = hub.switch() [ 769.840425] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 769.840425] env[59659]: ERROR 
nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] return self.greenlet.switch() [ 769.840425] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 769.840971] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] result = function(*args, **kwargs) [ 769.840971] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 769.840971] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] return func(*args, **kwargs) [ 769.840971] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 769.840971] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] raise e [ 769.840971] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 769.840971] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] nwinfo = self.network_api.allocate_for_instance( [ 769.840971] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 769.840971] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] created_port_ids = self._update_ports_for_instance( [ 769.840971] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 769.840971] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] with excutils.save_and_reraise_exception(): [ 769.840971] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 769.840971] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] self.force_reraise() [ 769.841313] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 769.841313] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] raise self.value [ 769.841313] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 769.841313] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] updated_port = self._update_port( [ 769.841313] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 769.841313] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] _ensure_no_port_binding_failure(port) [ 769.841313] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 769.841313] env[59659]: ERROR 
nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] raise exception.PortBindingFailed(port_id=port['id']) [ 769.841313] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] nova.exception.PortBindingFailed: Binding failed for port 919aae49-1ef4-4d7f-a76b-82c6e8107512, please check neutron logs for more information. [ 769.841313] env[59659]: ERROR nova.compute.manager [instance: 938a2016-8eaa-446a-b69c-3af59448d944] [ 769.841313] env[59659]: DEBUG nova.compute.utils [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] [instance: 938a2016-8eaa-446a-b69c-3af59448d944] Binding failed for port 919aae49-1ef4-4d7f-a76b-82c6e8107512, please check neutron logs for more information. {{(pid=59659) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 769.843029] env[59659]: DEBUG nova.compute.manager [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] [instance: 938a2016-8eaa-446a-b69c-3af59448d944] Build of instance 938a2016-8eaa-446a-b69c-3af59448d944 was re-scheduled: Binding failed for port 919aae49-1ef4-4d7f-a76b-82c6e8107512, please check neutron logs for more information. {{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 769.843550] env[59659]: DEBUG nova.compute.manager [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] [instance: 938a2016-8eaa-446a-b69c-3af59448d944] Unplugging VIFs for instance {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 769.843768] env[59659]: DEBUG oslo_concurrency.lockutils [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] Acquiring lock "refresh_cache-938a2016-8eaa-446a-b69c-3af59448d944" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 769.843954] env[59659]: DEBUG oslo_concurrency.lockutils [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] Acquired lock "refresh_cache-938a2016-8eaa-446a-b69c-3af59448d944" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 769.844168] env[59659]: DEBUG nova.network.neutron [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] [instance: 938a2016-8eaa-446a-b69c-3af59448d944] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 769.859430] env[59659]: WARNING nova.virt.vmwareapi.vmops [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 72a92098-562e-47bf-8dde-8b62b182d7bb could not be found. 
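Annotation: every build failure in the tracebacks above ends the same way: after Neutron returns the updated port, _update_port calls _ensure_no_port_binding_failure, which raises PortBindingFailed with the port ID, and the build is re-scheduled. A rough sketch of that check follows, under the assumption that a failed binding is reported via the port's binding:vif_type attribute; the local exception class stands in for nova.exception.PortBindingFailed.

    class PortBindingFailed(Exception):
        """Stand-in for nova.exception.PortBindingFailed in this sketch."""
        def __init__(self, port_id):
            super().__init__(
                'Binding failed for port %s, please check neutron logs for '
                'more information.' % port_id)

    def ensure_no_port_binding_failure(port):
        # Assumption: Neutron flags a failed binding as binding_failed.
        if port.get('binding:vif_type') == 'binding_failed':
            raise PortBindingFailed(port_id=port['id'])

    try:
        ensure_no_port_binding_failure(
            {'id': '35682cb9-e9d7-4838-847e-d5bcc61f6753',
             'binding:vif_type': 'binding_failed'})
    except PortBindingFailed as exc:
        print(exc)  # same message format as the errors in this log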
[ 769.859714] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] Instance destroyed {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 769.859941] env[59659]: INFO nova.compute.manager [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] Took 0.05 seconds to destroy the instance on the hypervisor. [ 769.860232] env[59659]: DEBUG oslo.service.loopingcall [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59659) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 769.860787] env[59659]: DEBUG nova.compute.manager [-] [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 769.860918] env[59659]: DEBUG nova.network.neutron [-] [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 769.909771] env[59659]: DEBUG nova.network.neutron [-] [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 769.911598] env[59659]: DEBUG nova.network.neutron [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] [instance: 938a2016-8eaa-446a-b69c-3af59448d944] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 769.917390] env[59659]: DEBUG nova.network.neutron [-] [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 769.931051] env[59659]: INFO nova.compute.manager [-] [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] Took 0.07 seconds to deallocate network for instance. 
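Annotation: the scheduler report-client lines below (and earlier) keep repeating the same inventory dict for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce. Usable capacity per resource class follows the usual Placement formula, (total - reserved) * allocation_ratio; a small worked example over the exact values from this log:

    # Inventory data as printed by the report client in this log.
    inventory = {
        'VCPU': {'total': 48, 'reserved': 0, 'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB': {'total': 400, 'reserved': 0, 'allocation_ratio': 1.0},
    }

    for rc, inv in inventory.items():
        capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print(rc, capacity)
    # -> VCPU 192.0, MEMORY_MB 196078.0, DISK_GB 400.0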
[ 769.933233] env[59659]: DEBUG nova.compute.claims [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] Aborting claim: {{(pid=59659) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 769.935284] env[59659]: DEBUG oslo_concurrency.lockutils [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 769.935284] env[59659]: DEBUG oslo_concurrency.lockutils [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 770.094037] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca63930d-abd1-40cc-8f5e-c07024d2bdca {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 770.103418] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8aa08cc7-3f57-4637-8d54-6e37bcaf99f0 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 770.138381] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2bdd317e-1bc1-4ccd-be57-310ce2c38d3b {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 770.148354] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-669f2bf2-f991-4263-b4a2-5c3f8fbfe256 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 770.166346] env[59659]: DEBUG nova.compute.provider_tree [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 770.174834] env[59659]: DEBUG nova.scheduler.client.report [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 770.195041] env[59659]: DEBUG oslo_concurrency.lockutils [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 
tempest-AttachVolumeNegativeTest-1432714470-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.260s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 770.195041] env[59659]: ERROR nova.compute.manager [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 35682cb9-e9d7-4838-847e-d5bcc61f6753, please check neutron logs for more information. [ 770.195041] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] Traceback (most recent call last): [ 770.195041] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 770.195041] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] self.driver.spawn(context, instance, image_meta, [ 770.195041] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 770.195041] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] self._vmops.spawn(context, instance, image_meta, injected_files, [ 770.195041] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 770.195041] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] vm_ref = self.build_virtual_machine(instance, [ 770.195456] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 770.195456] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] vif_infos = vmwarevif.get_vif_info(self._session, [ 770.195456] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 770.195456] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] for vif in network_info: [ 770.195456] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 770.195456] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] return self._sync_wrapper(fn, *args, **kwargs) [ 770.195456] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 770.195456] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] self.wait() [ 770.195456] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 770.195456] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] self[:] = self._gt.wait() [ 770.195456] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] File 
"/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 770.195456] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] return self._exit_event.wait() [ 770.195456] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 770.195456] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] result = hub.switch() [ 770.195894] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 770.195894] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] return self.greenlet.switch() [ 770.195894] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 770.195894] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] result = function(*args, **kwargs) [ 770.195894] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 770.195894] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] return func(*args, **kwargs) [ 770.195894] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 770.195894] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] raise e [ 770.195894] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 770.195894] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] nwinfo = self.network_api.allocate_for_instance( [ 770.195894] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 770.195894] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] created_port_ids = self._update_ports_for_instance( [ 770.195894] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 770.196273] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] with excutils.save_and_reraise_exception(): [ 770.196273] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 770.196273] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] self.force_reraise() [ 770.196273] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 770.196273] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] raise self.value [ 770.196273] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] File 
"/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 770.196273] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] updated_port = self._update_port( [ 770.196273] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 770.196273] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] _ensure_no_port_binding_failure(port) [ 770.196273] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 770.196273] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] raise exception.PortBindingFailed(port_id=port['id']) [ 770.196273] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] nova.exception.PortBindingFailed: Binding failed for port 35682cb9-e9d7-4838-847e-d5bcc61f6753, please check neutron logs for more information. [ 770.196273] env[59659]: ERROR nova.compute.manager [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] [ 770.198155] env[59659]: DEBUG nova.compute.utils [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] Binding failed for port 35682cb9-e9d7-4838-847e-d5bcc61f6753, please check neutron logs for more information. {{(pid=59659) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 770.198155] env[59659]: DEBUG nova.compute.manager [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] Build of instance 72a92098-562e-47bf-8dde-8b62b182d7bb was re-scheduled: Binding failed for port 35682cb9-e9d7-4838-847e-d5bcc61f6753, please check neutron logs for more information. 
{{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 770.198155] env[59659]: DEBUG nova.compute.manager [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] Unplugging VIFs for instance {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 770.198155] env[59659]: DEBUG oslo_concurrency.lockutils [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Acquiring lock "refresh_cache-72a92098-562e-47bf-8dde-8b62b182d7bb" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 770.198478] env[59659]: DEBUG oslo_concurrency.lockutils [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Acquired lock "refresh_cache-72a92098-562e-47bf-8dde-8b62b182d7bb" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 770.198478] env[59659]: DEBUG nova.network.neutron [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 770.282489] env[59659]: DEBUG nova.network.neutron [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 770.322372] env[59659]: DEBUG nova.network.neutron [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] [instance: 0ed4be35-b845-48ca-b892-657d96c12728] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 770.337016] env[59659]: DEBUG oslo_concurrency.lockutils [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] Releasing lock "refresh_cache-0ed4be35-b845-48ca-b892-657d96c12728" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 770.337016] env[59659]: DEBUG nova.compute.manager [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 770.337016] env[59659]: DEBUG nova.compute.manager [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] [instance: 0ed4be35-b845-48ca-b892-657d96c12728] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 770.337016] env[59659]: DEBUG nova.network.neutron [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] [instance: 0ed4be35-b845-48ca-b892-657d96c12728] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 770.416238] env[59659]: DEBUG nova.network.neutron [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] [instance: 0ed4be35-b845-48ca-b892-657d96c12728] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 770.426668] env[59659]: DEBUG nova.network.neutron [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] [instance: 0ed4be35-b845-48ca-b892-657d96c12728] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 770.440878] env[59659]: INFO nova.compute.manager [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] [instance: 0ed4be35-b845-48ca-b892-657d96c12728] Took 0.10 seconds to deallocate network for instance. [ 770.548420] env[59659]: DEBUG nova.network.neutron [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] [instance: 938a2016-8eaa-446a-b69c-3af59448d944] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 770.560542] env[59659]: INFO nova.scheduler.client.report [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] Deleted allocations for instance 0ed4be35-b845-48ca-b892-657d96c12728 [ 770.572241] env[59659]: DEBUG oslo_concurrency.lockutils [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] Releasing lock "refresh_cache-938a2016-8eaa-446a-b69c-3af59448d944" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 770.572462] env[59659]: DEBUG nova.compute.manager [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 770.572650] env[59659]: DEBUG nova.compute.manager [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] [instance: 938a2016-8eaa-446a-b69c-3af59448d944] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 770.573092] env[59659]: DEBUG nova.network.neutron [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] [instance: 938a2016-8eaa-446a-b69c-3af59448d944] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 770.586383] env[59659]: DEBUG oslo_concurrency.lockutils [None req-b211c439-48eb-45de-87a9-a1524f59bf9a tempest-ServerMetadataNegativeTestJSON-1677081655 tempest-ServerMetadataNegativeTestJSON-1677081655-project-member] Lock "0ed4be35-b845-48ca-b892-657d96c12728" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 13.646s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 770.619908] env[59659]: DEBUG nova.network.neutron [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] [instance: 938a2016-8eaa-446a-b69c-3af59448d944] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 770.632366] env[59659]: DEBUG nova.network.neutron [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] [instance: 938a2016-8eaa-446a-b69c-3af59448d944] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 770.643464] env[59659]: INFO nova.compute.manager [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] [instance: 938a2016-8eaa-446a-b69c-3af59448d944] Took 0.07 seconds to deallocate network for instance. [ 770.654967] env[59659]: DEBUG nova.network.neutron [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 770.671479] env[59659]: DEBUG oslo_concurrency.lockutils [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Releasing lock "refresh_cache-72a92098-562e-47bf-8dde-8b62b182d7bb" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 770.671700] env[59659]: DEBUG nova.compute.manager [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 770.671878] env[59659]: DEBUG nova.compute.manager [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] Deallocating network for instance {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 770.672051] env[59659]: DEBUG nova.network.neutron [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] deallocate_for_instance() {{(pid=59659) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 770.716318] env[59659]: DEBUG nova.network.neutron [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 770.724220] env[59659]: DEBUG nova.network.neutron [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 770.754503] env[59659]: INFO nova.compute.manager [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] [instance: 72a92098-562e-47bf-8dde-8b62b182d7bb] Took 0.08 seconds to deallocate network for instance. 
[ 770.805481] env[59659]: INFO nova.scheduler.client.report [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] Deleted allocations for instance 938a2016-8eaa-446a-b69c-3af59448d944 [ 770.827650] env[59659]: DEBUG oslo_concurrency.lockutils [None req-a286a9f0-510d-4024-931c-d2c960680507 tempest-ListServerFiltersTestJSON-278412246 tempest-ListServerFiltersTestJSON-278412246-project-member] Lock "938a2016-8eaa-446a-b69c-3af59448d944" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 12.983s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 770.922836] env[59659]: INFO nova.scheduler.client.report [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Deleted allocations for instance 72a92098-562e-47bf-8dde-8b62b182d7bb [ 770.960473] env[59659]: DEBUG oslo_concurrency.lockutils [None req-bbfcb3a2-6871-4da0-9d3f-41fc23dce976 tempest-AttachVolumeNegativeTest-1432714470 tempest-AttachVolumeNegativeTest-1432714470-project-member] Lock "72a92098-562e-47bf-8dde-8b62b182d7bb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 12.091s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 795.615972] env[59659]: WARNING oslo_vmware.rw_handles [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 795.615972] env[59659]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 795.615972] env[59659]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 795.615972] env[59659]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 795.615972] env[59659]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 795.615972] env[59659]: ERROR oslo_vmware.rw_handles response.begin() [ 795.615972] env[59659]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 795.615972] env[59659]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 795.615972] env[59659]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 795.615972] env[59659]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 795.615972] env[59659]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 795.615972] env[59659]: ERROR oslo_vmware.rw_handles [ 795.615972] env[59659]: DEBUG nova.virt.vmwareapi.images [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] Downloaded image file data 0fa786c9-f55e-46dc-b725-aa456ca9ff53 to vmware_temp/4d9e4f65-b9e9-4426-ac23-2a2e2760d555/0fa786c9-f55e-46dc-b725-aa456ca9ff53/tmp-sparse.vmdk on the data store datastore1 {{(pid=59659) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 795.617188] env[59659]: DEBUG 
nova.virt.vmwareapi.vmops [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] Caching image {{(pid=59659) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 795.617449] env[59659]: DEBUG nova.virt.vmwareapi.vm_util [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Copying Virtual Disk [datastore1] vmware_temp/4d9e4f65-b9e9-4426-ac23-2a2e2760d555/0fa786c9-f55e-46dc-b725-aa456ca9ff53/tmp-sparse.vmdk to [datastore1] vmware_temp/4d9e4f65-b9e9-4426-ac23-2a2e2760d555/0fa786c9-f55e-46dc-b725-aa456ca9ff53/0fa786c9-f55e-46dc-b725-aa456ca9ff53.vmdk {{(pid=59659) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 795.617746] env[59659]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-4afab4ad-acdd-43ea-97f6-95119765fa99 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 795.631017] env[59659]: DEBUG oslo_vmware.api [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Waiting for the task: (returnval){ [ 795.631017] env[59659]: value = "task-1384557" [ 795.631017] env[59659]: _type = "Task" [ 795.631017] env[59659]: } to complete. {{(pid=59659) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 795.639361] env[59659]: DEBUG oslo_vmware.api [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Task: {'id': task-1384557, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59659) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 796.142574] env[59659]: DEBUG oslo_vmware.exceptions [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Fault InvalidArgument not matched. 
{{(pid=59659) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 796.143025] env[59659]: DEBUG oslo_concurrency.lockutils [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Releasing lock "[datastore1] devstack-image-cache_base/0fa786c9-f55e-46dc-b725-aa456ca9ff53/0fa786c9-f55e-46dc-b725-aa456ca9ff53.vmdk" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 796.143648] env[59659]: ERROR nova.compute.manager [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 796.143648] env[59659]: Faults: ['InvalidArgument'] [ 796.143648] env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] Traceback (most recent call last): [ 796.143648] env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 796.143648] env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] yield resources [ 796.143648] env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 796.143648] env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] self.driver.spawn(context, instance, image_meta, [ 796.143648] env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 796.143648] env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] self._vmops.spawn(context, instance, image_meta, injected_files, [ 796.143648] env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 796.143648] env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] self._fetch_image_if_missing(context, vi) [ 796.143648] env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 796.144151] env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] image_cache(vi, tmp_image_ds_loc) [ 796.144151] env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 796.144151] env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] vm_util.copy_virtual_disk( [ 796.144151] env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 796.144151] env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] session._wait_for_task(vmdk_copy_task) [ 796.144151] env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 796.144151] 
env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] return self.wait_for_task(task_ref) [ 796.144151] env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 796.144151] env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] return evt.wait() [ 796.144151] env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 796.144151] env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] result = hub.switch() [ 796.144151] env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 796.144151] env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] return self.greenlet.switch() [ 796.144561] env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 796.144561] env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] self.f(*self.args, **self.kw) [ 796.144561] env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 796.144561] env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] raise exceptions.translate_fault(task_info.error) [ 796.144561] env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 796.144561] env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] Faults: ['InvalidArgument'] [ 796.144561] env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] [ 796.144561] env[59659]: INFO nova.compute.manager [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] Terminating instance [ 796.146840] env[59659]: DEBUG oslo_concurrency.lockutils [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Acquired lock "[datastore1] devstack-image-cache_base/0fa786c9-f55e-46dc-b725-aa456ca9ff53/0fa786c9-f55e-46dc-b725-aa456ca9ff53.vmdk" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 796.150883] env[59659]: DEBUG nova.virt.vmwareapi.ds_util [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59659) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 796.150883] env[59659]: DEBUG oslo_concurrency.lockutils [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Acquiring lock "refresh_cache-78ed17da-e8e8-4872-b1bf-95c4e77de8e6" {{(pid=59659) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 796.150883] env[59659]: DEBUG oslo_concurrency.lockutils [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Acquired lock "refresh_cache-78ed17da-e8e8-4872-b1bf-95c4e77de8e6" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 796.150883] env[59659]: DEBUG nova.network.neutron [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 796.150883] env[59659]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6138dbd6-7d69-45da-8c4e-70eebfbb18f8 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 796.160099] env[59659]: DEBUG nova.virt.vmwareapi.ds_util [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59659) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 796.160322] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59659) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 796.161122] env[59659]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7fa3bda8-5a37-4589-8e77-62953f4e3446 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 796.172319] env[59659]: DEBUG oslo_vmware.api [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Waiting for the task: (returnval){ [ 796.172319] env[59659]: value = "session[522b4e95-5a6c-8a31-9bee-9871c53e7a49]529b26e1-728a-e66a-72de-8d4ef5e79d1d" [ 796.172319] env[59659]: _type = "Task" [ 796.172319] env[59659]: } to complete. {{(pid=59659) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 796.184077] env[59659]: DEBUG oslo_vmware.api [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Task: {'id': session[522b4e95-5a6c-8a31-9bee-9871c53e7a49]529b26e1-728a-e66a-72de-8d4ef5e79d1d, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59659) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 796.261339] env[59659]: DEBUG nova.network.neutron [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] Instance cache missing network info. 
{{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 796.686678] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: a75a3491-94b0-4754-8e42-7bf49194a022] Preparing fetch location {{(pid=59659) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 796.686981] env[59659]: DEBUG nova.virt.vmwareapi.ds_util [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Creating directory with path [datastore1] vmware_temp/80c09f77-448b-4143-8c4f-6555727a2833/0fa786c9-f55e-46dc-b725-aa456ca9ff53 {{(pid=59659) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 796.687804] env[59659]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-cbfdb857-828d-4c3f-a198-dcd82a718d15 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 796.700269] env[59659]: DEBUG nova.virt.vmwareapi.ds_util [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Created directory with path [datastore1] vmware_temp/80c09f77-448b-4143-8c4f-6555727a2833/0fa786c9-f55e-46dc-b725-aa456ca9ff53 {{(pid=59659) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 796.700269] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: a75a3491-94b0-4754-8e42-7bf49194a022] Fetch image to [datastore1] vmware_temp/80c09f77-448b-4143-8c4f-6555727a2833/0fa786c9-f55e-46dc-b725-aa456ca9ff53/tmp-sparse.vmdk {{(pid=59659) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 796.700269] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: a75a3491-94b0-4754-8e42-7bf49194a022] Downloading image file data 0fa786c9-f55e-46dc-b725-aa456ca9ff53 to [datastore1] vmware_temp/80c09f77-448b-4143-8c4f-6555727a2833/0fa786c9-f55e-46dc-b725-aa456ca9ff53/tmp-sparse.vmdk on the data store datastore1 {{(pid=59659) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 796.701012] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33bc6edd-ebe9-4d6e-ae08-ca134e073e39 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 796.711591] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ffb267a1-7023-42de-a8c6-9678352557ce {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 796.723694] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6685c582-badd-473d-95e4-d19c26e59f3f {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 796.762231] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff727081-14a3-4b53-8539-97782a267431 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 
796.770082] env[59659]: DEBUG nova.network.neutron [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 796.771305] env[59659]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-b0ef1a79-49f8-47eb-a710-a8501cdf983e {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 796.787033] env[59659]: DEBUG oslo_concurrency.lockutils [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Releasing lock "refresh_cache-78ed17da-e8e8-4872-b1bf-95c4e77de8e6" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 796.787480] env[59659]: DEBUG nova.compute.manager [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] Start destroying the instance on the hypervisor. {{(pid=59659) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 796.787675] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] Destroying instance {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 796.789086] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7759170f-fabf-4567-9a2d-259fe2fd97f9 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 796.797882] env[59659]: DEBUG nova.virt.vmwareapi.images [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: a75a3491-94b0-4754-8e42-7bf49194a022] Downloading image file data 0fa786c9-f55e-46dc-b725-aa456ca9ff53 to the data store datastore1 {{(pid=59659) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 796.803035] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] Unregistering the VM {{(pid=59659) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 796.803266] env[59659]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-d9a01773-2510-46a1-93d2-b7f224056f48 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 796.833972] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] Unregistered the VM {{(pid=59659) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 796.834188] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] 
[instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] Deleting contents of the VM from datastore datastore1 {{(pid=59659) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 796.834353] env[59659]: DEBUG nova.virt.vmwareapi.ds_util [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Deleting the datastore file [datastore1] 78ed17da-e8e8-4872-b1bf-95c4e77de8e6 {{(pid=59659) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 796.834611] env[59659]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-6fcf1c37-7060-4f7f-adef-61e1d8e58144 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 796.845650] env[59659]: DEBUG oslo_vmware.api [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Waiting for the task: (returnval){ [ 796.845650] env[59659]: value = "task-1384559" [ 796.845650] env[59659]: _type = "Task" [ 796.845650] env[59659]: } to complete. {{(pid=59659) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 796.857500] env[59659]: DEBUG oslo_vmware.api [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Task: {'id': task-1384559, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59659) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 796.873278] env[59659]: DEBUG oslo_vmware.rw_handles [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/80c09f77-448b-4143-8c4f-6555727a2833/0fa786c9-f55e-46dc-b725-aa456ca9ff53/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59659) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 796.937951] env[59659]: DEBUG oslo_vmware.rw_handles [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Completed reading data from the image iterator. {{(pid=59659) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 796.938096] env[59659]: DEBUG oslo_vmware.rw_handles [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/80c09f77-448b-4143-8c4f-6555727a2833/0fa786c9-f55e-46dc-b725-aa456ca9ff53/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59659) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 797.355943] env[59659]: DEBUG oslo_vmware.api [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Task: {'id': task-1384559, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.033672} completed successfully. 
{{(pid=59659) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 797.355943] env[59659]: DEBUG nova.virt.vmwareapi.ds_util [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Deleted the datastore file {{(pid=59659) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 797.355943] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] Deleted contents of the VM from datastore datastore1 {{(pid=59659) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 797.355943] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] Instance destroyed {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 797.355943] env[59659]: INFO nova.compute.manager [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] Took 0.57 seconds to destroy the instance on the hypervisor. [ 797.356244] env[59659]: DEBUG oslo.service.loopingcall [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59659) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 797.356244] env[59659]: DEBUG nova.compute.manager [-] [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] Skipping network deallocation for instance since networking was not requested. 
{{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 797.358810] env[59659]: DEBUG nova.compute.claims [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] Aborting claim: {{(pid=59659) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 797.359178] env[59659]: DEBUG oslo_concurrency.lockutils [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 797.359515] env[59659]: DEBUG oslo_concurrency.lockutils [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 797.444850] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-839e9bf1-d767-4dcc-afc1-01bd4372fedd {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 797.452796] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f665f1a-a898-459d-a966-f55a2fcaf056 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 797.488382] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-716f792b-90bc-4287-afe2-7739f3e5d8cc {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 797.496685] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5202b288-22e1-41b5-86f9-157a958de551 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 797.513029] env[59659]: DEBUG nova.compute.provider_tree [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Updating inventory in ProviderTree for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 177, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 797.546046] env[59659]: ERROR nova.scheduler.client.report [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [req-9e967ec0-71e1-4436-891e-60d43bd4fc0f] Failed to update inventory to [{'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 
'min_unit': 1, 'max_unit': 177, 'step_size': 1, 'allocation_ratio': 1.0}}] for resource provider with UUID 69a84459-8a9e-4a6c-afd9-ec42e61132ce. Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict ", "code": "placement.concurrent_update", "request_id": "req-9e967ec0-71e1-4436-891e-60d43bd4fc0f"}]}: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 797.562889] env[59659]: DEBUG nova.scheduler.client.report [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Refreshing inventories for resource provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 797.577163] env[59659]: DEBUG nova.scheduler.client.report [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Updating ProviderTree inventory for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 797.577599] env[59659]: DEBUG nova.compute.provider_tree [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Updating inventory in ProviderTree for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 176, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 797.596834] env[59659]: DEBUG nova.scheduler.client.report [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Refreshing aggregate associations for resource provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce, aggregates: None {{(pid=59659) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 797.625915] env[59659]: DEBUG nova.scheduler.client.report [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Refreshing trait associations for resource provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce, traits: COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SAME_HOST_COLD_MIGRATE {{(pid=59659) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 797.679888] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6108a146-94ee-450a-802a-8accc7205497 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} 
[ 797.687552] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd7ff840-a92d-4799-90ef-ec6597bcb68c {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 797.720820] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ac5312a-046d-4be4-8b75-ba0eb665a805 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 797.729444] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d91a296-c35e-4b95-a813-1dcf5267c692 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 797.743995] env[59659]: DEBUG nova.compute.provider_tree [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Updating inventory in ProviderTree for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 177, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 797.793189] env[59659]: DEBUG nova.scheduler.client.report [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Updated inventory for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce with generation 45 in Placement from set_inventory_for_provider using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 177, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}} [ 797.793335] env[59659]: DEBUG nova.compute.provider_tree [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Updating resource provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce generation from 45 to 46 during operation: update_inventory {{(pid=59659) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 797.793489] env[59659]: DEBUG nova.compute.provider_tree [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Updating inventory in ProviderTree for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 177, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 797.820091] env[59659]: DEBUG oslo_concurrency.lockutils [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c 
tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.460s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 797.820501] env[59659]: ERROR nova.compute.manager [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 797.820501] env[59659]: Faults: ['InvalidArgument'] [ 797.820501] env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] Traceback (most recent call last): [ 797.820501] env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 797.820501] env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] self.driver.spawn(context, instance, image_meta, [ 797.820501] env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 797.820501] env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] self._vmops.spawn(context, instance, image_meta, injected_files, [ 797.820501] env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 797.820501] env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] self._fetch_image_if_missing(context, vi) [ 797.820501] env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 797.820501] env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] image_cache(vi, tmp_image_ds_loc) [ 797.820501] env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 797.820882] env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] vm_util.copy_virtual_disk( [ 797.820882] env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 797.820882] env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] session._wait_for_task(vmdk_copy_task) [ 797.820882] env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 797.820882] env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] return self.wait_for_task(task_ref) [ 797.820882] env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 797.820882] env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] return evt.wait() [ 797.820882] env[59659]: ERROR nova.compute.manager [instance: 
78ed17da-e8e8-4872-b1bf-95c4e77de8e6] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 797.820882] env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] result = hub.switch() [ 797.820882] env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 797.820882] env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] return self.greenlet.switch() [ 797.820882] env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 797.820882] env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] self.f(*self.args, **self.kw) [ 797.821251] env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 797.821251] env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] raise exceptions.translate_fault(task_info.error) [ 797.821251] env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 797.821251] env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] Faults: ['InvalidArgument'] [ 797.821251] env[59659]: ERROR nova.compute.manager [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] [ 797.821251] env[59659]: DEBUG nova.compute.utils [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] VimFaultException {{(pid=59659) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 797.823051] env[59659]: DEBUG nova.compute.manager [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] Build of instance 78ed17da-e8e8-4872-b1bf-95c4e77de8e6 was re-scheduled: A specified parameter was not correct: fileType [ 797.823051] env[59659]: Faults: ['InvalidArgument'] {{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 797.823539] env[59659]: DEBUG nova.compute.manager [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] Unplugging VIFs for instance {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 797.825435] env[59659]: DEBUG oslo_concurrency.lockutils [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Acquiring lock "refresh_cache-78ed17da-e8e8-4872-b1bf-95c4e77de8e6" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 797.825608] env[59659]: DEBUG oslo_concurrency.lockutils [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Acquired lock "refresh_cache-78ed17da-e8e8-4872-b1bf-95c4e77de8e6" {{(pid=59659) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 797.825770] env[59659]: DEBUG nova.network.neutron [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 797.876602] env[59659]: DEBUG nova.network.neutron [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 798.069520] env[59659]: DEBUG nova.network.neutron [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 798.080227] env[59659]: DEBUG oslo_concurrency.lockutils [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Releasing lock "refresh_cache-78ed17da-e8e8-4872-b1bf-95c4e77de8e6" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 798.080375] env[59659]: DEBUG nova.compute.manager [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 798.080558] env[59659]: DEBUG nova.compute.manager [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: 78ed17da-e8e8-4872-b1bf-95c4e77de8e6] Skipping network deallocation for instance since networking was not requested. 
{{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 798.193920] env[59659]: INFO nova.scheduler.client.report [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Deleted allocations for instance 78ed17da-e8e8-4872-b1bf-95c4e77de8e6 [ 798.217147] env[59659]: DEBUG oslo_concurrency.lockutils [None req-8b302e70-ca17-499e-9ee1-9a7ccf961b0c tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Lock "78ed17da-e8e8-4872-b1bf-95c4e77de8e6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 55.486s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 814.035185] env[59659]: DEBUG oslo_service.periodic_task [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59659) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 814.035465] env[59659]: DEBUG nova.compute.manager [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Starting heal instance info cache {{(pid=59659) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 814.035539] env[59659]: DEBUG nova.compute.manager [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Rebuilding the list of instances to heal {{(pid=59659) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 814.045137] env[59659]: DEBUG nova.compute.manager [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] [instance: a75a3491-94b0-4754-8e42-7bf49194a022] Skipping network cache update for instance because it is Building. {{(pid=59659) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 814.045283] env[59659]: DEBUG nova.compute.manager [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Didn't find any instances for network info cache update. 
{{(pid=59659) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 815.027233] env[59659]: DEBUG oslo_service.periodic_task [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59659) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 816.023766] env[59659]: DEBUG oslo_service.periodic_task [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=59659) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 816.034841] env[59659]: DEBUG oslo_service.periodic_task [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59659) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 817.027762] env[59659]: DEBUG oslo_service.periodic_task [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59659) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 817.028209] env[59659]: DEBUG oslo_service.periodic_task [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59659) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 818.027469] env[59659]: DEBUG oslo_service.periodic_task [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59659) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 818.027741] env[59659]: DEBUG oslo_service.periodic_task [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59659) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 818.039013] env[59659]: DEBUG oslo_concurrency.lockutils [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 818.039399] env[59659]: DEBUG oslo_concurrency.lockutils [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 818.039468] env[59659]: DEBUG oslo_concurrency.lockutils [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 818.039572] env[59659]: DEBUG nova.compute.resource_tracker [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59659) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 818.040630] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-2e533c70-baee-45d2-92a7-af96de227177 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 818.049236] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ef0a69b-0735-48de-be42-b504c20695f7 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 818.063279] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4e9f0945-9b0d-4d92-b5c7-92a533896185 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 818.069570] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-785721ee-187b-464e-bf1e-3ca06b82ed2d {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 818.097504] env[59659]: DEBUG nova.compute.resource_tracker [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181414MB free_disk=177GB free_vcpus=48 pci_devices=None {{(pid=59659) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 818.097678] env[59659]: DEBUG oslo_concurrency.lockutils [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 818.097825] env[59659]: DEBUG oslo_concurrency.lockutils [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 818.133448] env[59659]: DEBUG nova.compute.resource_tracker [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Instance a75a3491-94b0-4754-8e42-7bf49194a022 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59659) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 818.133637] env[59659]: DEBUG nova.compute.resource_tracker [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Total usable vcpus: 48, total allocated vcpus: 1 {{(pid=59659) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 818.133777] env[59659]: DEBUG nova.compute.resource_tracker [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=640MB phys_disk=200GB used_disk=1GB total_vcpus=48 used_vcpus=1 pci_stats=[] {{(pid=59659) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 818.157297] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-286cd988-f3d6-4fbd-ba50-81839d06efe9 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 818.164301] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-481c281c-9e8d-4839-9edb-a5360e02f238 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 818.194010] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b4983d9-22c0-4231-aa79-dc3acf7852a4 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 818.200591] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0becc509-5b51-4234-91e8-adf10030b9b2 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 818.212946] env[59659]: DEBUG nova.compute.provider_tree [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 818.220482] env[59659]: DEBUG nova.scheduler.client.report [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 177, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 818.232733] env[59659]: DEBUG nova.compute.resource_tracker [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59659) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 818.232900] env[59659]: DEBUG oslo_concurrency.lockutils [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.135s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 819.232530] env[59659]: DEBUG oslo_service.periodic_task [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Running 
periodic task ComputeManager._instance_usage_audit {{(pid=59659) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 819.232956] env[59659]: DEBUG oslo_service.periodic_task [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59659) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 819.232956] env[59659]: DEBUG nova.compute.manager [None req-d68d156f-4335-409d-8cbf-671e31ef1543 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59659) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 845.633671] env[59659]: WARNING oslo_vmware.rw_handles [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 845.633671] env[59659]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 845.633671] env[59659]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 845.633671] env[59659]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 845.633671] env[59659]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 845.633671] env[59659]: ERROR oslo_vmware.rw_handles response.begin() [ 845.633671] env[59659]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 845.633671] env[59659]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 845.633671] env[59659]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 845.633671] env[59659]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 845.633671] env[59659]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 845.633671] env[59659]: ERROR oslo_vmware.rw_handles [ 845.634741] env[59659]: DEBUG nova.virt.vmwareapi.images [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: a75a3491-94b0-4754-8e42-7bf49194a022] Downloaded image file data 0fa786c9-f55e-46dc-b725-aa456ca9ff53 to vmware_temp/80c09f77-448b-4143-8c4f-6555727a2833/0fa786c9-f55e-46dc-b725-aa456ca9ff53/tmp-sparse.vmdk on the data store datastore1 {{(pid=59659) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 845.635839] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: a75a3491-94b0-4754-8e42-7bf49194a022] Caching image {{(pid=59659) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 845.636119] env[59659]: DEBUG nova.virt.vmwareapi.vm_util [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Copying Virtual Disk [datastore1] vmware_temp/80c09f77-448b-4143-8c4f-6555727a2833/0fa786c9-f55e-46dc-b725-aa456ca9ff53/tmp-sparse.vmdk to [datastore1] vmware_temp/80c09f77-448b-4143-8c4f-6555727a2833/0fa786c9-f55e-46dc-b725-aa456ca9ff53/0fa786c9-f55e-46dc-b725-aa456ca9ff53.vmdk {{(pid=59659) 
copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 845.636425] env[59659]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-dc5d1390-c662-4126-8223-86362d51ef0a {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 845.644704] env[59659]: DEBUG oslo_vmware.api [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Waiting for the task: (returnval){ [ 845.644704] env[59659]: value = "task-1384560" [ 845.644704] env[59659]: _type = "Task" [ 845.644704] env[59659]: } to complete. {{(pid=59659) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 845.652466] env[59659]: DEBUG oslo_vmware.api [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Task: {'id': task-1384560, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59659) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 846.155404] env[59659]: DEBUG oslo_vmware.api [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Task: {'id': task-1384560, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59659) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 846.655690] env[59659]: DEBUG oslo_vmware.api [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Task: {'id': task-1384560, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59659) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 847.156349] env[59659]: DEBUG oslo_vmware.api [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Task: {'id': task-1384560, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59659) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 847.657657] env[59659]: DEBUG oslo_vmware.api [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Task: {'id': task-1384560, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59659) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 848.158063] env[59659]: DEBUG oslo_vmware.api [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Task: {'id': task-1384560, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59659) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 848.661047] env[59659]: DEBUG oslo_vmware.exceptions [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Fault InvalidArgument not matched. 
{{(pid=59659) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 848.661431] env[59659]: DEBUG oslo_concurrency.lockutils [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Releasing lock "[datastore1] devstack-image-cache_base/0fa786c9-f55e-46dc-b725-aa456ca9ff53/0fa786c9-f55e-46dc-b725-aa456ca9ff53.vmdk" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 848.661845] env[59659]: ERROR nova.compute.manager [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: a75a3491-94b0-4754-8e42-7bf49194a022] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 848.661845] env[59659]: Faults: ['InvalidArgument'] [ 848.661845] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] Traceback (most recent call last): [ 848.661845] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 848.661845] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] yield resources [ 848.661845] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 848.661845] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] self.driver.spawn(context, instance, image_meta, [ 848.661845] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 848.661845] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] self._vmops.spawn(context, instance, image_meta, injected_files, [ 848.661845] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 848.661845] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] self._fetch_image_if_missing(context, vi) [ 848.661845] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 848.662410] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] image_cache(vi, tmp_image_ds_loc) [ 848.662410] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 848.662410] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] vm_util.copy_virtual_disk( [ 848.662410] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 848.662410] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] session._wait_for_task(vmdk_copy_task) [ 848.662410] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 848.662410] 
env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] return self.wait_for_task(task_ref) [ 848.662410] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 848.662410] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] return evt.wait() [ 848.662410] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 848.662410] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] result = hub.switch() [ 848.662410] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 848.662410] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] return self.greenlet.switch() [ 848.662947] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 848.662947] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] self.f(*self.args, **self.kw) [ 848.662947] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 848.662947] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] raise exceptions.translate_fault(task_info.error) [ 848.662947] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 848.662947] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] Faults: ['InvalidArgument'] [ 848.662947] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] [ 848.662947] env[59659]: INFO nova.compute.manager [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: a75a3491-94b0-4754-8e42-7bf49194a022] Terminating instance [ 848.664855] env[59659]: DEBUG oslo_concurrency.lockutils [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Acquiring lock "refresh_cache-a75a3491-94b0-4754-8e42-7bf49194a022" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 848.665012] env[59659]: DEBUG oslo_concurrency.lockutils [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Acquired lock "refresh_cache-a75a3491-94b0-4754-8e42-7bf49194a022" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 848.665179] env[59659]: DEBUG nova.network.neutron [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: a75a3491-94b0-4754-8e42-7bf49194a022] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 848.688749] env[59659]: 
DEBUG nova.network.neutron [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: a75a3491-94b0-4754-8e42-7bf49194a022] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 848.744735] env[59659]: DEBUG nova.network.neutron [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: a75a3491-94b0-4754-8e42-7bf49194a022] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 848.753345] env[59659]: DEBUG oslo_concurrency.lockutils [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Releasing lock "refresh_cache-a75a3491-94b0-4754-8e42-7bf49194a022" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 848.753710] env[59659]: DEBUG nova.compute.manager [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: a75a3491-94b0-4754-8e42-7bf49194a022] Start destroying the instance on the hypervisor. {{(pid=59659) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 848.753893] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: a75a3491-94b0-4754-8e42-7bf49194a022] Destroying instance {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 848.754894] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a6b66dc-8272-4099-883f-35454fc2db4a {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 848.762686] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: a75a3491-94b0-4754-8e42-7bf49194a022] Unregistering the VM {{(pid=59659) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 848.762930] env[59659]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-6e07a079-ff95-4dbb-91f0-dd18cf86a2ee {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 848.785358] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: a75a3491-94b0-4754-8e42-7bf49194a022] Unregistered the VM {{(pid=59659) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 848.785541] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: a75a3491-94b0-4754-8e42-7bf49194a022] Deleting contents of the VM from datastore datastore1 {{(pid=59659) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 848.785711] env[59659]: DEBUG nova.virt.vmwareapi.ds_util [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 
tempest-ServerShowV247Test-1842064909-project-member] Deleting the datastore file [datastore1] a75a3491-94b0-4754-8e42-7bf49194a022 {{(pid=59659) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 848.785934] env[59659]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-73f32880-3523-4a53-aa90-e57fb3f3cbf6 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 848.791405] env[59659]: DEBUG oslo_vmware.api [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Waiting for the task: (returnval){ [ 848.791405] env[59659]: value = "task-1384562" [ 848.791405] env[59659]: _type = "Task" [ 848.791405] env[59659]: } to complete. {{(pid=59659) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 848.799234] env[59659]: DEBUG oslo_vmware.api [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Task: {'id': task-1384562, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59659) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 849.300815] env[59659]: DEBUG oslo_vmware.api [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Task: {'id': task-1384562, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.033004} completed successfully. {{(pid=59659) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 849.301024] env[59659]: DEBUG nova.virt.vmwareapi.ds_util [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Deleted the datastore file {{(pid=59659) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 849.301183] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: a75a3491-94b0-4754-8e42-7bf49194a022] Deleted contents of the VM from datastore datastore1 {{(pid=59659) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 849.301349] env[59659]: DEBUG nova.virt.vmwareapi.vmops [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: a75a3491-94b0-4754-8e42-7bf49194a022] Instance destroyed {{(pid=59659) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 849.301514] env[59659]: INFO nova.compute.manager [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: a75a3491-94b0-4754-8e42-7bf49194a022] Took 0.55 seconds to destroy the instance on the hypervisor. [ 849.301742] env[59659]: DEBUG oslo.service.loopingcall [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=59659) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 849.301928] env[59659]: DEBUG nova.compute.manager [-] [instance: a75a3491-94b0-4754-8e42-7bf49194a022] Skipping network deallocation for instance since networking was not requested. {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 849.304245] env[59659]: DEBUG nova.compute.claims [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: a75a3491-94b0-4754-8e42-7bf49194a022] Aborting claim: {{(pid=59659) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 849.304403] env[59659]: DEBUG oslo_concurrency.lockutils [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 849.304600] env[59659]: DEBUG oslo_concurrency.lockutils [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 849.362858] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-122201b9-a1aa-4b44-bb11-dd42c6c0e657 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 849.370463] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2fa39532-dfe1-46a8-b76e-5a24f92b93cc {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 849.399753] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-861a4f4f-f9a5-43dd-a78a-c5c883e82278 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 849.406978] env[59659]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff401e89-989a-4156-bcb1-11f0fbef7ae6 {{(pid=59659) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 849.420937] env[59659]: DEBUG nova.compute.provider_tree [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Inventory has not changed in ProviderTree for provider: 69a84459-8a9e-4a6c-afd9-ec42e61132ce {{(pid=59659) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 849.428572] env[59659]: DEBUG nova.scheduler.client.report [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Inventory has not changed for provider 69a84459-8a9e-4a6c-afd9-ec42e61132ce based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 177, 'step_size': 1, 
'allocation_ratio': 1.0}} {{(pid=59659) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 849.440643] env[59659]: DEBUG oslo_concurrency.lockutils [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.136s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 849.441165] env[59659]: ERROR nova.compute.manager [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: a75a3491-94b0-4754-8e42-7bf49194a022] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 849.441165] env[59659]: Faults: ['InvalidArgument'] [ 849.441165] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] Traceback (most recent call last): [ 849.441165] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 849.441165] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] self.driver.spawn(context, instance, image_meta, [ 849.441165] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 849.441165] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] self._vmops.spawn(context, instance, image_meta, injected_files, [ 849.441165] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 849.441165] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] self._fetch_image_if_missing(context, vi) [ 849.441165] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 849.441165] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] image_cache(vi, tmp_image_ds_loc) [ 849.441165] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 849.441529] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] vm_util.copy_virtual_disk( [ 849.441529] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 849.441529] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] session._wait_for_task(vmdk_copy_task) [ 849.441529] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 849.441529] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] return self.wait_for_task(task_ref) [ 849.441529] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] File 
"/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 849.441529] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] return evt.wait() [ 849.441529] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 849.441529] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] result = hub.switch() [ 849.441529] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 849.441529] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] return self.greenlet.switch() [ 849.441529] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 849.441529] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] self.f(*self.args, **self.kw) [ 849.441888] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 849.441888] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] raise exceptions.translate_fault(task_info.error) [ 849.441888] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 849.441888] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] Faults: ['InvalidArgument'] [ 849.441888] env[59659]: ERROR nova.compute.manager [instance: a75a3491-94b0-4754-8e42-7bf49194a022] [ 849.441888] env[59659]: DEBUG nova.compute.utils [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: a75a3491-94b0-4754-8e42-7bf49194a022] VimFaultException {{(pid=59659) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 849.443129] env[59659]: DEBUG nova.compute.manager [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: a75a3491-94b0-4754-8e42-7bf49194a022] Build of instance a75a3491-94b0-4754-8e42-7bf49194a022 was re-scheduled: A specified parameter was not correct: fileType [ 849.443129] env[59659]: Faults: ['InvalidArgument'] {{(pid=59659) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 849.443493] env[59659]: DEBUG nova.compute.manager [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: a75a3491-94b0-4754-8e42-7bf49194a022] Unplugging VIFs for instance {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 849.443711] env[59659]: DEBUG oslo_concurrency.lockutils [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Acquiring lock "refresh_cache-a75a3491-94b0-4754-8e42-7bf49194a022" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 849.443855] env[59659]: DEBUG 
oslo_concurrency.lockutils [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Acquired lock "refresh_cache-a75a3491-94b0-4754-8e42-7bf49194a022" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 849.444014] env[59659]: DEBUG nova.network.neutron [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: a75a3491-94b0-4754-8e42-7bf49194a022] Building network info cache for instance {{(pid=59659) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 849.467857] env[59659]: DEBUG nova.network.neutron [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: a75a3491-94b0-4754-8e42-7bf49194a022] Instance cache missing network info. {{(pid=59659) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 849.524146] env[59659]: DEBUG nova.network.neutron [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: a75a3491-94b0-4754-8e42-7bf49194a022] Updating instance_info_cache with network_info: [] {{(pid=59659) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 849.532628] env[59659]: DEBUG oslo_concurrency.lockutils [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Releasing lock "refresh_cache-a75a3491-94b0-4754-8e42-7bf49194a022" {{(pid=59659) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 849.532832] env[59659]: DEBUG nova.compute.manager [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59659) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 849.533008] env[59659]: DEBUG nova.compute.manager [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] [instance: a75a3491-94b0-4754-8e42-7bf49194a022] Skipping network deallocation for instance since networking was not requested. {{(pid=59659) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 849.613774] env[59659]: INFO nova.scheduler.client.report [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Deleted allocations for instance a75a3491-94b0-4754-8e42-7bf49194a022 [ 849.628163] env[59659]: DEBUG oslo_concurrency.lockutils [None req-6480e2eb-733f-45d4-9040-d43f38b81be4 tempest-ServerShowV247Test-1842064909 tempest-ServerShowV247Test-1842064909-project-member] Lock "a75a3491-94b0-4754-8e42-7bf49194a022" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 101.907s {{(pid=59659) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
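The failure that repeats through this log is confined to one step of the VMware driver's image-cache path: the Glance image has already been downloaded to the datastore as a tmp-sparse.vmdk, and the build dies when nova.virt.vmwareapi.vm_util.copy_virtual_disk asks vCenter's VirtualDiskManager to copy it into the image cache and the CopyVirtualDisk_Task returns the InvalidArgument fault for fileType. The sketch below is a minimal, hypothetical reconstruction of that call path using only the oslo.vmware entry points visible in the tracebacks (invoke_api, wait_for_task, VimFaultException); it is not the Nova code itself. It assumes an already-connected oslo_vmware.api.VMwareAPISession (session), a datacenter managed-object reference (dc_ref), and datastore paths of the same "[datastore1] vmware_temp/..." form seen above; all of those arguments are placeholders supplied by the caller, not values taken from this environment.

# Minimal sketch, assuming an established oslo.vmware session. It mirrors the
# sourceName/sourceDatacenter/destName call that the tracebacks attribute to
# nova.virt.vmwareapi.vm_util.copy_virtual_disk.
from oslo_vmware import exceptions as vexc


def copy_sparse_image(session, dc_ref, src_path, dst_path):
    """Start a CopyVirtualDisk_Task and wait for it, as the trace does.

    session  -- a connected oslo_vmware.api.VMwareAPISession (assumed)
    dc_ref   -- managed-object reference of the source datacenter (assumed)
    src_path -- e.g. "[datastore1] vmware_temp/.../tmp-sparse.vmdk"
    dst_path -- e.g. "[datastore1] vmware_temp/.../<image-id>.vmdk"
    """
    vdm = session.vim.service_content.virtualDiskManager
    try:
        # Kick off the copy on the vCenter side ...
        task = session.invoke_api(session.vim, 'CopyVirtualDisk_Task', vdm,
                                  sourceName=src_path,
                                  sourceDatacenter=dc_ref,
                                  destName=dst_path)
        # ... and block in oslo.vmware's looping-call poller until the task
        # finishes or fails (the same wait_for_task seen in the traceback).
        return session.wait_for_task(task)
    except vexc.VimFaultException as exc:
        # vCenter rejects the call with the InvalidArgument fault on fileType.
        # oslo.vmware has no specific exception class registered for that
        # fault name ("Fault InvalidArgument not matched" above), so the
        # generic VimFaultException is raised with the names in fault_list.
        if 'InvalidArgument' in exc.fault_list:
            print('CopyVirtualDisk rejected by vCenter: %s' % exc)
        raise

Because the fault name is not mapped to a dedicated exception class, the compute manager only ever sees the generic "A specified parameter was not correct: fileType / Faults: ['InvalidArgument']" message, aborts the resource claim, and re-schedules the build, which is consistent with the same traceback recurring for each instance in this run.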