[ 555.821642] env[59576]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 556.293815] env[59620]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 557.810763] env[59620]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' {{(pid=59620) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 557.811121] env[59620]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' {{(pid=59620) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 557.811239] env[59620]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' {{(pid=59620) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 557.811477] env[59620]: INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
[ 557.812559] env[59620]: WARNING nova.servicegroup.api [-] Report interval must be less than service down time. Current config: . Setting service_down_time to: 300
[ 557.928898] env[59620]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm {{(pid=59620) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}}
[ 557.949100] env[59620]: DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.020s {{(pid=59620) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}}
[ 558.048760] env[59620]: INFO nova.virt.driver [None req-c89bd55e-cb17-4a5d-bfb8-f61b210a500d None None] Loading compute driver 'vmwareapi.VMwareVCDriver'
[ 558.120834] env[59620]: DEBUG oslo_concurrency.lockutils [-] Acquiring lock "oslo_vmware_api_lock" by "oslo_vmware.api.VMwareAPISession._create_session" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 558.120983] env[59620]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" acquired by "oslo_vmware.api.VMwareAPISession._create_session" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 558.121069] env[59620]: DEBUG oslo_vmware.service [-] Creating suds client with soap_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk' and wsdl_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk/vimService.wsdl' {{(pid=59620) __init__ /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:242}}
[ 561.291767] env[59620]: DEBUG oslo_vmware.service [-] Invoking ServiceInstance.RetrieveServiceContent with opID=oslo.vmware-2540a9db-a0d7-40e7-be2e-729340016c0a {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 561.307603] env[59620]: DEBUG oslo_vmware.api [-] Logging into host: vc1.osci.c.eu-de-1.cloud.sap. {{(pid=59620) _create_session /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:242}}
[ 561.307716] env[59620]: DEBUG oslo_vmware.service [-] Invoking SessionManager.Login with opID=oslo.vmware-cadf123b-b8ee-4144-a321-560f7469f874 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 561.347685] env[59620]: INFO oslo_vmware.api [-] Successfully established new session; session ID is f8929.
[ 561.347832] env[59620]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" "released" by "oslo_vmware.api.VMwareAPISession._create_session" :: held 3.227s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 561.348494] env[59620]: INFO nova.virt.vmwareapi.driver [None req-c89bd55e-cb17-4a5d-bfb8-f61b210a500d None None] VMware vCenter version: 7.0.3
[ 561.351934] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-58876797-6fc6-459c-a772-f1b4ee4b04b8 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 561.368921] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3441788-7785-4e98-9bfe-a104a51f110c {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 561.374872] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-acb50869-9a67-4889-9988-0e9623e6ef40 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 561.382188] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5635dbf7-d278-4946-b76e-be89ecfdee42 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 561.395344] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-393a46d6-44e3-43fb-bc40-77c5c05c6c88 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 561.401540] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6caa9106-42cf-43be-98c3-6c3e9bac6142 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 561.431922] env[59620]: DEBUG oslo_vmware.service [-] Invoking ExtensionManager.FindExtension with opID=oslo.vmware-150367d1-a4af-4905-aa21-30567f43a5db {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 561.437156] env[59620]: DEBUG nova.virt.vmwareapi.driver [None req-c89bd55e-cb17-4a5d-bfb8-f61b210a500d None None] Extension org.openstack.compute already exists. {{(pid=59620) _register_openstack_extension /opt/stack/nova/nova/virt/vmwareapi/driver.py:214}}
[ 561.439793] env[59620]: INFO nova.compute.provider_config [None req-c89bd55e-cb17-4a5d-bfb8-f61b210a500d None None] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
[ 561.458098] env[59620]: DEBUG nova.context [None req-c89bd55e-cb17-4a5d-bfb8-f61b210a500d None None] Found 2 cells: 00000000-0000-0000-0000-000000000000(cell0),ca1c0d21-d6a1-418c-abcf-92608d1b00f5(cell1) {{(pid=59620) load_cells /opt/stack/nova/nova/context.py:464}}
[ 561.460135] env[59620]: DEBUG oslo_concurrency.lockutils [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 561.460352] env[59620]: DEBUG oslo_concurrency.lockutils [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 561.461086] env[59620]: DEBUG oslo_concurrency.lockutils [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 561.461435] env[59620]: DEBUG oslo_concurrency.lockutils [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] Acquiring lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 561.461631] env[59620]: DEBUG oslo_concurrency.lockutils [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 561.462595] env[59620]: DEBUG oslo_concurrency.lockutils [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] Lock "ca1c0d21-d6a1-418c-abcf-92608d1b00f5" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 561.475085] env[59620]: DEBUG oslo_db.sqlalchemy.engines [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=59620) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}}
[ 561.479973] env[59620]: DEBUG oslo_db.sqlalchemy.engines [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=59620) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}}
[ 561.481890] env[59620]: ERROR nova.db.main.api [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] No DB access allowed in nova-compute: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 561.481890] env[59620]:     result = function(*args, **kwargs)
[ 561.481890] env[59620]:   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 561.481890] env[59620]:     return func(*args, **kwargs)
[ 561.481890] env[59620]:   File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 561.481890] env[59620]:     result = fn(*args, **kwargs)
[ 561.481890] env[59620]:   File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 561.481890] env[59620]:     return f(*args, **kwargs)
[ 561.481890] env[59620]:   File "/opt/stack/nova/nova/objects/service.py", line 546, in _db_service_get_minimum_version
[ 561.481890] env[59620]:     return db.service_get_minimum_version(context, binaries)
[ 561.481890] env[59620]:   File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 561.481890] env[59620]:     _check_db_access()
[ 561.481890] env[59620]:   File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 561.481890] env[59620]:     stacktrace = ''.join(traceback.format_stack())
[ 561.481890] env[59620]: 
[ 561.484108] env[59620]: ERROR nova.db.main.api [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] No DB access allowed in nova-compute: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 561.484108] env[59620]:     result = function(*args, **kwargs)
[ 561.484108] env[59620]:   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 561.484108] env[59620]:     return func(*args, **kwargs)
[ 561.484108] env[59620]:   File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 561.484108] env[59620]:     result = fn(*args, **kwargs)
[ 561.484108] env[59620]:   File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 561.484108] env[59620]:     return f(*args, **kwargs)
[ 561.484108] env[59620]:   File "/opt/stack/nova/nova/objects/service.py", line 546, in _db_service_get_minimum_version
[ 561.484108] env[59620]:     return db.service_get_minimum_version(context, binaries)
[ 561.484108] env[59620]:   File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 561.484108] env[59620]:     _check_db_access()
[ 561.484108] env[59620]:   File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 561.484108] env[59620]:     stacktrace = ''.join(traceback.format_stack())
[ 561.484108] env[59620]: 
[ 561.485553] env[59620]: WARNING nova.objects.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] Failed to get minimum service version for cell 00000000-0000-0000-0000-000000000000
[ 561.485553] env[59620]: WARNING nova.objects.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] Failed to get minimum service version for cell ca1c0d21-d6a1-418c-abcf-92608d1b00f5
[ 561.485553] env[59620]: DEBUG oslo_concurrency.lockutils [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] Acquiring lock "singleton_lock" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 561.485553] env[59620]: DEBUG oslo_concurrency.lockutils [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] Acquired lock "singleton_lock" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 561.485553] env[59620]: DEBUG oslo_concurrency.lockutils [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] Releasing lock "singleton_lock" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 561.485553] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] Full set of CONF: {{(pid=59620) _wait_for_exit_or_signal /usr/local/lib/python3.10/dist-packages/oslo_service/service.py:362}}
[ 561.485726] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] ******************************************************************************** {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2589}}
[ 561.485726] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] Configuration options gathered from: {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2590}}
[ 561.488018] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] command line args: ['--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-cpu-common.conf', '--config-file', '/etc/nova/nova-cpu-1.conf'] {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2591}}
[ 561.488018] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2592}}
[ 561.488018] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] ================================================================================ {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2594}}
[ 561.488018] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] allow_resize_to_same_host = True {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.488018] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] arq_binding_timeout = 300 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.488018] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] backdoor_port = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.488249] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] backdoor_socket = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.488249] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] block_device_allocate_retries = 60 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.488249] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] block_device_allocate_retries_interval = 3 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.488249] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cert = self.pem {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.488249] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] compute_driver = vmwareapi.VMwareVCDriver {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.488249] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] compute_monitors = [] {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.488249] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] config_dir = [] {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.488467] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] config_drive_format = iso9660 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.488467] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.488467] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] config_source = [] {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.488467] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] console_host = devstack {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.488581] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] control_exchange = nova {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.488730] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cpu_allocation_ratio = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.488811] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] daemon = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.488968] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] debug = True {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.489141] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] default_access_ip_network_name = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.489302] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] default_availability_zone = nova {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.490235] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] default_ephemeral_format = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.490235] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.490235] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] default_schedule_zone = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.490235] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] disk_allocation_ratio = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.490235] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] enable_new_services = True {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.490640] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] enabled_apis = ['osapi_compute'] {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.490640] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] enabled_ssl_apis = [] {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.490640] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] flat_injected = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.490793] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] force_config_drive = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.490913] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] force_raw_images = True {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.491085] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] graceful_shutdown_timeout = 5 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.491241] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] heal_instance_info_cache_interval = 60 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.491451] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] host = cpu-1 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.491610] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] initial_cpu_allocation_ratio = 4.0 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.491761] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] initial_disk_allocation_ratio = 1.0 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.492038] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] initial_ram_allocation_ratio = 1.0 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.492239] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.492321] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] instance_build_timeout = 0 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.494237] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] instance_delete_interval = 300 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.494237] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] instance_format = [instance: %(uuid)s] {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.494237] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] instance_name_template = instance-%08x {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.494237] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] instance_usage_audit = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.494237] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] instance_usage_audit_period = month {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.494237] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.494643] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] instances_path = /opt/stack/data/nova/instances {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.494643] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] internal_service_availability_zone = internal {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.494643] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] key = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.494643] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] live_migration_retry_count = 30 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.494643] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] log_config_append = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.494643] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.494918] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] log_dir = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.494918] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] log_file = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.494918] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] log_options = True {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.494918] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] log_rotate_interval = 1 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.494918] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] log_rotate_interval_type = days {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.495154] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] log_rotation_type = none {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.495154] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.495298] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.495379] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.495531] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.495647] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.495798] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] long_rpc_timeout = 1800 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.496065] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] max_concurrent_builds = 10 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.496157] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] max_concurrent_live_migrations = 1 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.496666] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] max_concurrent_snapshots = 5 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.496666] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] max_local_block_devices = 3 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.496666] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] max_logfile_count = 30 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.496840] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] max_logfile_size_mb = 200 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.496840] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] maximum_instance_delete_attempts = 5 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.496979] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] metadata_listen = 0.0.0.0 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.497141] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] metadata_listen_port = 8775 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.497299] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] metadata_workers = 2 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.497946] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] migrate_max_retries = -1 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.497946] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] mkisofs_cmd = genisoimage {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.497946] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] my_block_storage_ip = 10.180.1.21 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.497946] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] my_ip = 10.180.1.21 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.500260] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] network_allocate_retries = 0 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 561.500260] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] non_inheritable_image_properties
= ['cache_in_nova', 'bittorrent'] {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.500260] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] osapi_compute_listen = 0.0.0.0 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.500260] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] osapi_compute_listen_port = 8774 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.500260] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] osapi_compute_unique_server_name_scope = {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.500260] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] osapi_compute_workers = 2 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.500260] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] password_length = 12 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.500456] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] periodic_enable = True {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.500456] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] periodic_fuzzy_delay = 60 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.500456] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] pointer_model = usbtablet {{(pid=59620) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.500456] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] preallocate_images = none {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.500456] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] publish_errors = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.500456] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] pybasedir = /opt/stack/nova {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.500456] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] ram_allocation_ratio = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.500639] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] rate_limit_burst = 0 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.500639] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] rate_limit_except_level = CRITICAL {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.500802] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] rate_limit_interval = 0 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.500926] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] reboot_timeout = 0 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.501104] env[59620]: DEBUG oslo_service.service [None 
req-04ad95f9-5007-4989-b955-f99dc1343618 None None] reclaim_instance_interval = 0 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.501254] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] record = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.501414] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] reimage_timeout_per_gb = 60 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.501573] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] report_interval = 120 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.501726] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] rescue_timeout = 0 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.501875] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] reserved_host_cpus = 0 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.502045] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] reserved_host_disk_mb = 0 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.502203] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] reserved_host_memory_mb = 512 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.502354] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] reserved_huge_pages = None {{(pid=59620) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.502503] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] resize_confirm_window = 0 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.503139] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] resize_fs_using_block_device = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.503139] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] resume_guests_state_on_host_boot = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.503139] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.503139] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] rpc_response_timeout = 60 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.503264] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] run_external_periodic_tasks = True {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.505071] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] running_deleted_instance_action = reap {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.505071] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] running_deleted_instance_poll_interval = 1800 {{(pid=59620) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.505071] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] running_deleted_instance_timeout = 0 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.505071] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] scheduler_instance_sync_interval = 120 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.505071] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] service_down_time = 300 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.505071] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] servicegroup_driver = db {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.505071] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] shelved_offload_time = 0 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.505372] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] shelved_poll_interval = 3600 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.505372] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] shutdown_timeout = 0 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.505372] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] source_is_ipv6 = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.505372] env[59620]: DEBUG oslo_service.service [None 
req-04ad95f9-5007-4989-b955-f99dc1343618 None None] ssl_only = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.505372] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] state_path = /opt/stack/data/n-cpu-1 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.505372] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] sync_power_state_interval = 600 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.505530] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] sync_power_state_pool_size = 1000 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.505666] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] syslog_log_facility = LOG_USER {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.505730] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] tempdir = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.505876] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] timeout_nbd = 10 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.506048] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] transport_url = **** {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.506203] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] update_resources_interval = 0 {{(pid=59620) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.506383] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] use_cow_images = True {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.506539] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] use_eventlog = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.506810] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] use_journal = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.506899] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] use_json = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.507152] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] use_rootwrap_daemon = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.507152] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] use_stderr = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.507749] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] use_syslog = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.507749] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vcpu_pin_set = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.507749] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] 
vif_plugging_is_fatal = True {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.507749] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vif_plugging_timeout = 300 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.508227] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] virt_mkfs = [] {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.508227] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] volume_usage_poll_interval = 0 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.508227] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] watch_log_file = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.508608] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] web = /usr/share/spice-html5 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 561.508677] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_concurrency.disable_process_locking = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.508909] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_concurrency.lock_path = /opt/stack/data/n-cpu-1 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.509144] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=59620) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.509760] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_messaging_metrics.metrics_enabled = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.509760] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_messaging_metrics.metrics_process_name = {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.509760] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.509760] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.510237] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] api.auth_strategy = keystone {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.510237] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] api.compute_link_prefix = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.510335] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.510718] env[59620]: DEBUG oslo_service.service [None 
req-04ad95f9-5007-4989-b955-f99dc1343618 None None] api.dhcp_domain = novalocal {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.510718] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] api.enable_instance_password = True {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.510718] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] api.glance_link_prefix = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.511609] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.511609] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] api.instance_list_cells_batch_strategy = distributed {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.511609] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] api.instance_list_per_project_cells = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.511609] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] api.list_records_by_skipping_down_cells = True {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.511609] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] api.local_metadata_per_cell = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.511972] env[59620]: DEBUG oslo_service.service [None 
req-04ad95f9-5007-4989-b955-f99dc1343618 None None] api.max_limit = 1000 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.511972] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] api.metadata_cache_expiration = 15 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.512083] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] api.neutron_default_tenant_id = default {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.512224] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] api.use_forwarded_for = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.512385] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] api.use_neutron_default_nets = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.512549] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] api.vendordata_dynamic_connect_timeout = 5 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.513013] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] api.vendordata_dynamic_failure_fatal = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.513013] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] api.vendordata_dynamic_read_timeout = 5 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.513158] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] 
api.vendordata_dynamic_ssl_certfile = {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.513214] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] api.vendordata_dynamic_targets = [] {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.513347] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] api.vendordata_jsonfile_path = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.513547] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] api.vendordata_providers = ['StaticJSON'] {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.513737] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cache.backend = dogpile.cache.memcached {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.513899] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cache.backend_argument = **** {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.514086] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cache.config_prefix = cache.oslo {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.514257] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cache.dead_timeout = 60.0 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.514425] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cache.debug_cache_backend = False {{(pid=59620) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.514583] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cache.enable_retry_client = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.514738] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cache.enable_socket_keepalive = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.514901] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cache.enabled = True {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.515075] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cache.expiration_time = 600 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.515234] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cache.hashclient_retry_attempts = 2 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.515397] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cache.hashclient_retry_delay = 1.0 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.515591] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cache.memcache_dead_retry = 300 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.515778] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cache.memcache_password = {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.515942] env[59620]: DEBUG 
oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cache.memcache_pool_connection_get_timeout = 10 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.516113] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cache.memcache_pool_flush_on_reconnect = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.516302] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cache.memcache_pool_maxsize = 10 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.516460] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cache.memcache_pool_unused_timeout = 60 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.516623] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cache.memcache_sasl_enabled = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.516798] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cache.memcache_servers = ['localhost:11211'] {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.516960] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cache.memcache_socket_timeout = 1.0 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.517137] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cache.memcache_username = {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.517300] env[59620]: DEBUG oslo_service.service [None 
req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cache.proxies = [] {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.517461] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cache.retry_attempts = 2 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.517624] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cache.retry_delay = 0.0 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.517784] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cache.socket_keepalive_count = 1 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.517941] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cache.socket_keepalive_idle = 1 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.518122] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cache.socket_keepalive_interval = 1 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.518279] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cache.tls_allowed_ciphers = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.518536] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cache.tls_cafile = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.518700] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cache.tls_certfile = None {{(pid=59620) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.518865] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cache.tls_enabled = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.519032] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cache.tls_keyfile = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.519203] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cinder.auth_section = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.519375] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cinder.auth_type = password {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.519532] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cinder.cafile = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.519700] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cinder.catalog_info = volumev3::publicURL {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.519855] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cinder.certfile = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.520023] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cinder.collect_timing = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.520200] env[59620]: DEBUG oslo_service.service [None 
req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cinder.cross_az_attach = True {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.520358] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cinder.debug = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.520513] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cinder.endpoint_template = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.520669] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cinder.http_retries = 3 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.520863] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cinder.insecure = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.520974] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cinder.keyfile = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.521151] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cinder.os_region_name = RegionOne {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.521310] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cinder.split_loggers = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.521464] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cinder.timeout = None {{(pid=59620) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.521631] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] compute.consecutive_build_service_disable_threshold = 10 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.521786] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] compute.cpu_dedicated_set = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.521945] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] compute.cpu_shared_set = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.522103] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] compute.image_type_exclude_list = [] {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.522261] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] compute.live_migration_wait_for_vif_plug = True {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.522418] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] compute.max_concurrent_disk_ops = 0 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.522582] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] compute.max_disk_devices_to_attach = -1 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.522736] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] compute.packing_host_numa_cells_allocation_strategy = False {{(pid=59620) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.522899] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.523066] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] compute.resource_provider_association_refresh = 300 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.523224] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] compute.shutdown_retry_interval = 10 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.523401] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.523576] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] conductor.workers = 2 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.523755] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] console.allowed_origins = [] {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.523914] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] console.ssl_ciphers = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.524092] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] console.ssl_minimum_version = default {{(pid=59620) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.524261] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] consoleauth.token_ttl = 600 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.524428] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cyborg.cafile = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.524583] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cyborg.certfile = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.524741] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cyborg.collect_timing = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.524893] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cyborg.connect_retries = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.525070] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cyborg.connect_retry_delay = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.525231] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cyborg.endpoint_override = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.525391] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cyborg.insecure = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.525583] env[59620]: DEBUG oslo_service.service [None 
req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cyborg.keyfile = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.525812] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cyborg.max_version = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.525989] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cyborg.min_version = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.526162] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cyborg.region_name = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.526317] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cyborg.service_name = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.526486] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cyborg.service_type = accelerator {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.526647] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cyborg.split_loggers = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.526800] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cyborg.status_code_retries = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.526953] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cyborg.status_code_retry_delay = None {{(pid=59620) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.527119] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cyborg.timeout = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.527298] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.527453] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] cyborg.version = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.527634] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] database.backend = sqlalchemy {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.527806] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] database.connection = **** {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.527972] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] database.connection_debug = 0 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.528150] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] database.connection_parameters = {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.528318] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] database.connection_recycle_time = 3600 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.528627] env[59620]: DEBUG 
oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] database.connection_trace = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.528815] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] database.db_inc_retry_interval = True {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.528983] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] database.db_max_retries = 20 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.529173] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] database.db_max_retry_interval = 10 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.529338] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] database.db_retry_interval = 1 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.529510] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] database.max_overflow = 50 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.529673] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] database.max_pool_size = 5 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.529837] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] database.max_retries = 10 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.529997] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] 
database.mysql_enable_ndb = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.530179] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] database.mysql_sql_mode = TRADITIONAL {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.530337] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] database.mysql_wsrep_sync_wait = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.530496] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] database.pool_timeout = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.530664] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] database.retry_interval = 10 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.530818] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] database.slave_connection = **** {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.530983] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] database.sqlite_synchronous = True {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.531153] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] database.use_db_reconnect = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.531332] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] api_database.backend = sqlalchemy {{(pid=59620) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.531514] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] api_database.connection = **** {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.531683] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] api_database.connection_debug = 0 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.531849] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] api_database.connection_parameters = {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.532016] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] api_database.connection_recycle_time = 3600 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.532183] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] api_database.connection_trace = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.532345] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] api_database.db_inc_retry_interval = True {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.532503] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] api_database.db_max_retries = 20 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.532663] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] api_database.db_max_retry_interval = 10 {{(pid=59620) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.532819] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] api_database.db_retry_interval = 1 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.532983] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] api_database.max_overflow = 50 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.533167] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] api_database.max_pool_size = 5 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.533332] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] api_database.max_retries = 10 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.533491] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] api_database.mysql_enable_ndb = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.533658] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.533816] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] api_database.mysql_wsrep_sync_wait = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.533971] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] api_database.pool_timeout = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 
561.534414] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] api_database.retry_interval = 10 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.534591] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] api_database.slave_connection = **** {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.534762] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] api_database.sqlite_synchronous = True {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.534936] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] devices.enabled_mdev_types = [] {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.535128] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.535296] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] ephemeral_storage_encryption.enabled = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.535459] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] ephemeral_storage_encryption.key_size = 512 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.535631] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] glance.api_servers = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.535792] env[59620]: DEBUG 
oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] glance.cafile = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.535948] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] glance.certfile = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.536119] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] glance.collect_timing = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.536294] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] glance.connect_retries = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.536466] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] glance.connect_retry_delay = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.536634] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] glance.debug = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.536795] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] glance.default_trusted_certificate_ids = [] {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.536953] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] glance.enable_certificate_validation = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.537139] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] 
glance.enable_rbd_download = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.537302] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] glance.endpoint_override = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.537466] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] glance.insecure = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.537677] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] glance.keyfile = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.537867] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] glance.max_version = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.538047] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] glance.min_version = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.538215] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] glance.num_retries = 3 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.538417] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] glance.rbd_ceph_conf = {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.538590] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] glance.rbd_connect_timeout = 5 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.538761] 
env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] glance.rbd_pool = {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.538928] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] glance.rbd_user = {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.539095] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] glance.region_name = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.539252] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] glance.service_name = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.539418] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] glance.service_type = image {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.539581] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] glance.split_loggers = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.539740] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] glance.status_code_retries = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.539895] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] glance.status_code_retry_delay = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.540071] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] glance.timeout = None {{(pid=59620) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.540252] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.540417] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] glance.verify_glance_signatures = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.540577] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] glance.version = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.540772] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] guestfs.debug = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.540913] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] hyperv.config_drive_cdrom = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.541089] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] hyperv.config_drive_inject_password = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.541253] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.541414] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] hyperv.enable_instance_metrics_collection = False {{(pid=59620) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.541578] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] hyperv.enable_remotefx = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.541754] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] hyperv.instances_path_share = {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.541920] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] hyperv.iscsi_initiator_list = [] {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.542104] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] hyperv.limit_cpu_features = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.542275] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.542438] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.542611] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] hyperv.power_state_check_timeframe = 60 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.542771] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] hyperv.power_state_event_polling_interval = 2 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.542940] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] hyperv.qemu_img_cmd = qemu-img.exe {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.543113] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] hyperv.use_multipath_io = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.543276] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] hyperv.volume_attach_retry_count = 10 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.543435] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] hyperv.volume_attach_retry_interval = 5 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.543595] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] hyperv.vswitch_name = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.543756] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.543924] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] mks.enabled = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.544280] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.544472] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] image_cache.manager_interval = 2400 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.544639] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] image_cache.precache_concurrency = 1 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.544811] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] image_cache.remove_unused_base_images = True {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.544977] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.545153] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.545326] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] image_cache.subdirectory_name = _base {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.545497] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] ironic.api_max_retries = 60 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.545660] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] ironic.api_retry_interval = 2 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.545818] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] ironic.auth_section = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.545975] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] ironic.auth_type = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.546155] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] ironic.cafile = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.546328] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] ironic.certfile = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.546500] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] ironic.collect_timing = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.546685] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] ironic.connect_retries = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.546813] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] ironic.connect_retry_delay = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.546966] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] ironic.endpoint_override = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.547137] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] ironic.insecure = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.547292] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] ironic.keyfile = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.547453] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] ironic.max_version = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.547606] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] ironic.min_version = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.547762] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] ironic.partition_key = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.547923] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] ironic.peer_list = [] {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.548087] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] ironic.region_name = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.548251] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] ironic.serial_console_state_timeout = 10 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.548438] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] ironic.service_name = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.548624] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] ironic.service_type = baremetal {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.548773] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] ironic.split_loggers = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.548933] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] ironic.status_code_retries = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.549099] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] ironic.status_code_retry_delay = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.549254] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] ironic.timeout = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.549470] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] ironic.valid_interfaces = ['internal', 'public'] {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.549633] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] ironic.version = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.549816] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.549979] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] key_manager.fixed_key = **** {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.550186] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] barbican.auth_endpoint = http://localhost/identity/v3 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.550351] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] barbican.barbican_api_version = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.550509] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] barbican.barbican_endpoint = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.550676] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] barbican.barbican_endpoint_type = public {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.550830] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] barbican.barbican_region_name = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.550983] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] barbican.cafile = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.551150] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] barbican.certfile = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.551312] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] barbican.collect_timing = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.551503] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] barbican.insecure = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.551678] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] barbican.keyfile = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.551845] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] barbican.number_of_retries = 60 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.552016] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] barbican.retry_delay = 1 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.552182] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] barbican.send_service_user_token = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.552342] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] barbican.split_loggers = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.552496] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] barbican.timeout = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.552658] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] barbican.verify_ssl = True {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.552814] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] barbican.verify_ssl_path = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.552970] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] barbican_service_user.auth_section = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.553139] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] barbican_service_user.auth_type = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.553294] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] barbican_service_user.cafile = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.553452] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] barbican_service_user.certfile = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.553637] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] barbican_service_user.collect_timing = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.553798] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] barbican_service_user.insecure = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.553952] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] barbican_service_user.keyfile = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.554141] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] barbican_service_user.split_loggers = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.554301] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] barbican_service_user.timeout = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.554743] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vault.approle_role_id = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.554743] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vault.approle_secret_id = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.554873] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vault.cafile = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.554929] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vault.certfile = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.555097] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vault.collect_timing = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.555259] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vault.insecure = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.555426] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vault.keyfile = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.555594] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vault.kv_mountpoint = secret {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.555754] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vault.kv_version = 2 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.555908] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vault.namespace = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.556072] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vault.root_token_id = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.556233] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vault.split_loggers = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.556450] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vault.ssl_ca_crt_file = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.556622] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vault.timeout = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.556795] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vault.use_ssl = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.556952] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.557130] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] keystone.cafile = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.557287] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] keystone.certfile = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.557449] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] keystone.collect_timing = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.557609] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] keystone.connect_retries = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.557768] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] keystone.connect_retry_delay = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.557926] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] keystone.endpoint_override = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.558107] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] keystone.insecure = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.558270] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] keystone.keyfile = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.558447] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] keystone.max_version = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.558612] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] keystone.min_version = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.558770] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] keystone.region_name = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.558925] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] keystone.service_name = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.559104] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] keystone.service_type = identity {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.559265] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] keystone.split_loggers = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.559437] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] keystone.status_code_retries = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.559604] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] keystone.status_code_retry_delay = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.559759] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] keystone.timeout = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.559934] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.560105] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] keystone.version = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.560302] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.connection_uri = {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.560460] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.cpu_mode = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.560629] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.cpu_model_extra_flags = [] {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.560800] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.cpu_models = [] {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.560964] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.cpu_power_governor_high = performance {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.561139] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.cpu_power_governor_low = powersave {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.561301] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.cpu_power_management = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.561470] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.561634] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.device_detach_attempts = 8 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.561794] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.device_detach_timeout = 20 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.561955] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.disk_cachemodes = [] {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.562123] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.disk_prefix = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.562286] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.enabled_perf_events = [] {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.562449] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.file_backed_memory = 0 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.562614] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.gid_maps = [] {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.562769] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.hw_disk_discard = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.562925] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.hw_machine_type = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.563115] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.images_rbd_ceph_conf = {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.563284] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.563461] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.563656] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.images_rbd_glance_store_name = {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.563831] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.images_rbd_pool = rbd {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.563995] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.images_type = default {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.564167] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.images_volume_group = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.564325] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.inject_key = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.564485] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.inject_partition = -2 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.564644] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.inject_password = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.564800] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.iscsi_iface = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.564958] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.iser_use_multipath = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.565127] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.live_migration_bandwidth = 0 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.565288] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.live_migration_completion_timeout = 800 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.565442] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.live_migration_downtime = 500 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.565600] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.live_migration_downtime_delay = 75 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.565759] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.live_migration_downtime_steps = 10 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.565911] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.live_migration_inbound_addr = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.566077] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.live_migration_permit_auto_converge = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.566235] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.live_migration_permit_post_copy = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.566405] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.live_migration_scheme = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.566576] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.live_migration_timeout_action = abort {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.566749] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.live_migration_tunnelled = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.566908] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.live_migration_uri = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.567081] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.live_migration_with_native_tls = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.567245] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.max_queues = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.567409] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.mem_stats_period_seconds = 10 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.567565] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.nfs_mount_options = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.567866] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.nfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.568049] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.num_aoe_discover_tries = 3 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.568221] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.num_iser_scan_tries = 5 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.568394] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.num_memory_encrypted_guests = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.568568] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.num_nvme_discover_tries = 5 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.568731] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.num_pcie_ports = 0 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.568897] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.num_volume_scan_tries = 5 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.569070] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.pmem_namespaces = [] {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.569229] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.quobyte_client_cfg = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.569524] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.quobyte_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.569700] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.rbd_connect_timeout = 5 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.569863] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.570035] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.570198] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.rbd_secret_uuid = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.570354] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.rbd_user = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.570515] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.realtime_scheduler_priority = 1 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.570684] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.remote_filesystem_transport = ssh {{(pid=59620) log_opt_values
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.570842] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.rescue_image_id = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.570997] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.rescue_kernel_id = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.571183] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.rescue_ramdisk_id = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.571348] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.rng_dev_path = /dev/urandom {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.571505] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.rx_queue_size = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.571671] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.smbfs_mount_options = {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.571940] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.smbfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.572123] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.snapshot_compression = False {{(pid=59620) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.572285] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.snapshot_image_format = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.572499] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.572661] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.sparse_logical_volumes = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.572821] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.swtpm_enabled = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.572984] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.swtpm_group = tss {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.573164] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.swtpm_user = tss {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.573332] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.sysinfo_serial = unique {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.573486] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.tx_queue_size = None {{(pid=59620) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.573650] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.uid_maps = [] {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.573808] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.use_virtio_for_bridges = True {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.573974] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.virt_type = kvm {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.574149] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.volume_clear = zero {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.574311] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.volume_clear_size = 0 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.574475] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.volume_use_multipath = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.574631] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.vzstorage_cache_path = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.574795] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=59620) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.574959] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.vzstorage_mount_group = qemu {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.575144] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.vzstorage_mount_opts = [] {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.575311] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.vzstorage_mount_perms = 0770 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.575608] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.575793] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.vzstorage_mount_user = stack {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.575960] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.576146] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] neutron.auth_section = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.576317] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] neutron.auth_type = password {{(pid=59620) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.576477] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] neutron.cafile = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.576633] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] neutron.certfile = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.576793] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] neutron.collect_timing = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.576945] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] neutron.connect_retries = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.577105] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] neutron.connect_retry_delay = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.577505] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] neutron.default_floating_pool = public {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.577505] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] neutron.endpoint_override = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.577576] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] neutron.extension_sync_interval = 600 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.577719] env[59620]: DEBUG 
oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] neutron.http_retries = 3 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.577876] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] neutron.insecure = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.578039] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] neutron.keyfile = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.578197] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] neutron.max_version = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.578380] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] neutron.metadata_proxy_shared_secret = **** {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.578549] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] neutron.min_version = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.578715] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] neutron.ovs_bridge = br-int {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.578883] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] neutron.physnets = [] {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.579062] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] neutron.region_name = RegionOne {{(pid=59620) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.579237] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] neutron.service_metadata_proxy = True {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.579428] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] neutron.service_name = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.579624] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] neutron.service_type = network {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.579788] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] neutron.split_loggers = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.579942] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] neutron.status_code_retries = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.580108] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] neutron.status_code_retry_delay = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.580264] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] neutron.timeout = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.580442] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=59620) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.580602] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] neutron.version = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.580773] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] notifications.bdms_in_notifications = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.580944] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] notifications.default_level = INFO {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.581133] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] notifications.notification_format = unversioned {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.581296] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] notifications.notify_on_state_change = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.581465] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.581638] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] pci.alias = [] {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.581805] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] pci.device_spec = [] {{(pid=59620) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.581966] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] pci.report_in_placement = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.582152] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] placement.auth_section = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.582324] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] placement.auth_type = password {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.582491] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] placement.auth_url = http://10.180.1.21/identity {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.582650] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] placement.cafile = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.582804] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] placement.certfile = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.582961] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] placement.collect_timing = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.583141] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] placement.connect_retries = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.583300] env[59620]: 
DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] placement.connect_retry_delay = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.583455] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] placement.default_domain_id = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.583610] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] placement.default_domain_name = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.583763] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] placement.domain_id = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.583915] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] placement.domain_name = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.584080] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] placement.endpoint_override = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.584239] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] placement.insecure = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.584393] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] placement.keyfile = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.584547] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] 
placement.max_version = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.584699] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] placement.min_version = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.584862] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] placement.password = **** {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.585025] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] placement.project_domain_id = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.585190] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] placement.project_domain_name = Default {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.585354] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] placement.project_id = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.585534] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] placement.project_name = service {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.585702] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] placement.region_name = RegionOne {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.585858] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] placement.service_name = None {{(pid=59620) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.586031] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] placement.service_type = placement {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.586196] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] placement.split_loggers = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.586354] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] placement.status_code_retries = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.586511] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] placement.status_code_retry_delay = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.586666] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] placement.system_scope = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.586821] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] placement.timeout = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.586975] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] placement.trust_id = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.587153] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] placement.user_domain_id = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.587319] env[59620]: 
DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] placement.user_domain_name = Default {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.587479] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] placement.user_id = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.587652] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] placement.username = placement {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.587830] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] placement.valid_interfaces = ['internal', 'public'] {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.587989] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] placement.version = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.588203] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] quota.cores = 20 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.588396] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] quota.count_usage_from_placement = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.588584] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.588755] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None 
None] quota.injected_file_content_bytes = 10240 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.588919] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] quota.injected_file_path_length = 255 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.589097] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] quota.injected_files = 5 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.589262] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] quota.instances = 10 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.589424] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] quota.key_pairs = 100 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.589589] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] quota.metadata_items = 128 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.589749] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] quota.ram = 51200 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.589909] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] quota.recheck_quota = True {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.590087] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] quota.server_group_members = 10 {{(pid=59620) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.590251] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] quota.server_groups = 10 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.590417] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] rdp.enabled = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.590741] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.590929] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] scheduler.discover_hosts_in_cells_interval = -1 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.591121] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.591289] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] scheduler.image_metadata_prefilter = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.591454] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.591618] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] scheduler.max_attempts = 3 {{(pid=59620) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.591779] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] scheduler.max_placement_results = 1000 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.591939] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.592111] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] scheduler.query_placement_for_availability_zone = True {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.592272] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] scheduler.query_placement_for_image_type_support = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.592430] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.592602] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] scheduler.workers = 2 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.592775] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.592943] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] 
filter_scheduler.aggregate_image_properties_isolation_separator = . {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.593135] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.593299] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.593467] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.593632] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.593793] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.593976] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.594154] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 
None None] filter_scheduler.host_subset_size = 1 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.594312] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] filter_scheduler.image_properties_default_architecture = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.594473] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.594640] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] filter_scheduler.isolated_hosts = [] {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.594800] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] filter_scheduler.isolated_images = [] {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.594961] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] filter_scheduler.max_instances_per_host = 50 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.595146] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.595311] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] filter_scheduler.pci_in_placement = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.595474] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 
None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.595638] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.595798] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.595956] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.596131] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.596297] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.596459] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] filter_scheduler.track_instance_changes = True {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.596645] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 
561.596814] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] metrics.required = True {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.596977] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] metrics.weight_multiplier = 1.0 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.597153] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] metrics.weight_of_unavailable = -10000.0 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.597316] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] metrics.weight_setting = [] {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.597612] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.597786] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] serial_console.enabled = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.597959] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] serial_console.port_range = 10000:20000 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.598138] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.598303] env[59620]: DEBUG oslo_service.service [None 
req-04ad95f9-5007-4989-b955-f99dc1343618 None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.598496] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] serial_console.serialproxy_port = 6083 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.598670] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] service_user.auth_section = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.598839] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] service_user.auth_type = password {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.599006] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] service_user.cafile = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.599185] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] service_user.certfile = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.599355] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] service_user.collect_timing = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.599588] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] service_user.insecure = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.599766] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] service_user.keyfile = None 
{{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.599941] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] service_user.send_service_user_token = True {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.600120] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] service_user.split_loggers = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.600278] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] service_user.timeout = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.600448] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] spice.agent_enabled = True {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.600626] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] spice.enabled = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.600937] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.601145] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] spice.html5proxy_host = 0.0.0.0 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.601314] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] spice.html5proxy_port = 6082 {{(pid=59620) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.601476] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] spice.image_compression = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.601636] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] spice.jpeg_compression = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.601793] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] spice.playback_compression = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.601963] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] spice.server_listen = 127.0.0.1 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.602143] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.602314] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] spice.streaming_mode = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.602547] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] spice.zlib_compression = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.602734] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] upgrade_levels.baseapi = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.602896] env[59620]: 
DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] upgrade_levels.cert = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.603091] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] upgrade_levels.compute = auto {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.603258] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] upgrade_levels.conductor = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.603424] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] upgrade_levels.scheduler = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.603593] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vendordata_dynamic_auth.auth_section = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.603755] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vendordata_dynamic_auth.auth_type = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.603913] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vendordata_dynamic_auth.cafile = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.604082] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vendordata_dynamic_auth.certfile = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.604247] env[59620]: DEBUG oslo_service.service [None 
req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vendordata_dynamic_auth.collect_timing = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.604407] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vendordata_dynamic_auth.insecure = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.604564] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vendordata_dynamic_auth.keyfile = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.604722] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vendordata_dynamic_auth.split_loggers = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.604882] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vendordata_dynamic_auth.timeout = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.605080] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vmware.api_retry_count = 10 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.605246] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vmware.ca_file = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.605415] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vmware.cache_prefix = devstack-image-cache {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.605587] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None 
None] vmware.cluster_name = testcl1 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.605749] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vmware.connection_pool_size = 10 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.605906] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vmware.console_delay_seconds = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.606080] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vmware.datastore_regex = ^datastore.* {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.606286] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vmware.host_ip = vc1.osci.c.eu-de-1.cloud.sap {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.606530] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vmware.host_password = **** {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.606719] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vmware.host_port = 443 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.606893] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vmware.host_username = administrator@vsphere.local {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.607089] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vmware.insecure = True {{(pid=59620) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.607257] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vmware.integration_bridge = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.607423] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vmware.maximum_objects = 100 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.607585] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vmware.pbm_default_policy = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.607748] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vmware.pbm_enabled = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.607906] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vmware.pbm_wsdl_location = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.608085] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vmware.serial_log_dir = /opt/vmware/vspc {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.608320] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vmware.serial_port_proxy_uri = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.608416] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vmware.serial_port_service_uri = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.608592] 
env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vmware.task_poll_interval = 0.5 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.608766] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vmware.use_linked_clone = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.608936] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vmware.vnc_keymap = en-us {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.609112] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vmware.vnc_port = 5900 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.609278] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vmware.vnc_port_total = 10000 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.609485] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vnc.auth_schemes = ['none'] {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.609666] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vnc.enabled = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.609955] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vnc.novncproxy_base_url = http://127.0.0.1:6080/vnc_auto.html {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.610150] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] 
vnc.novncproxy_host = 0.0.0.0 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.610319] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vnc.novncproxy_port = 6080 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.610498] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vnc.server_listen = 127.0.0.1 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.610676] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vnc.server_proxyclient_address = 127.0.0.1 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.610838] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vnc.vencrypt_ca_certs = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.610999] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vnc.vencrypt_client_cert = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.611185] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vnc.vencrypt_client_key = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.611367] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.611566] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] workarounds.disable_deep_image_inspection = False {{(pid=59620) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.611737] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] workarounds.disable_fallback_pcpu_query = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.611901] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] workarounds.disable_group_policy_check_upcall = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.612075] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.612237] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] workarounds.disable_rootwrap = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.612397] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] workarounds.enable_numa_live_migration = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.612560] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.612720] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.612879] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] 
workarounds.handle_virt_lifecycle_events = True {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.613049] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] workarounds.libvirt_disable_apic = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.613212] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] workarounds.never_download_image_if_on_rbd = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.613373] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] workarounds.qemu_monitor_announce_self_count = 3 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.613535] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.613696] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.613855] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.614018] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.614180] env[59620]: DEBUG oslo_service.service [None 
req-04ad95f9-5007-4989-b955-f99dc1343618 None None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.614338] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.614504] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.614670] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.614850] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.615036] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] wsgi.client_socket_timeout = 900 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.615213] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] wsgi.default_pool_size = 1000 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.615378] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] wsgi.keep_alive = True {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.615544] env[59620]: DEBUG 
oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] wsgi.max_header_line = 16384 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.615703] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] wsgi.secure_proxy_ssl_header = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.615860] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] wsgi.ssl_ca_file = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.616026] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] wsgi.ssl_cert_file = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.616184] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] wsgi.ssl_key_file = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.616353] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] wsgi.tcp_keepidle = 600 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.616522] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.616680] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] zvm.ca_file = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.616837] env[59620]: DEBUG oslo_service.service [None 
req-04ad95f9-5007-4989-b955-f99dc1343618 None None] zvm.cloud_connector_url = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.617123] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] zvm.image_tmp_path = /opt/stack/data/n-cpu-1/images {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.617298] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] zvm.reachable_timeout = 300 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.617480] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_policy.enforce_new_defaults = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.617640] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_policy.enforce_scope = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.617811] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_policy.policy_default_rule = default {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.617992] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.618177] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_policy.policy_file = policy.yaml {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.618354] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None 
None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.618536] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.618699] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.618857] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.619036] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.619213] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.619411] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.619604] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] profiler.connection_string = messaging:// {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.619774] env[59620]: DEBUG 
oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] profiler.enabled = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.619942] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] profiler.es_doc_type = notification {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.620120] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] profiler.es_scroll_size = 10000 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.620291] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] profiler.es_scroll_time = 2m {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.620451] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] profiler.filter_error_trace = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.620619] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] profiler.hmac_keys = SECRET_KEY {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.620782] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] profiler.sentinel_service_name = mymaster {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.620993] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] profiler.socket_timeout = 0.1 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.621126] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] 
profiler.trace_sqlalchemy = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.621285] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] remote_debug.host = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.621445] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] remote_debug.port = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.621624] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.621788] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.621952] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.622124] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.622286] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.622446] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] 
oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.622607] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.622766] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.622926] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.623105] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_messaging_rabbit.kombu_compression = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.623283] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.623514] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.623645] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.623819] env[59620]: DEBUG 
oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.623978] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.624189] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.624337] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.624502] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.624665] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.624832] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.624993] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=59620) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.625167] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.625327] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.625489] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.625656] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.625822] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_messaging_rabbit.ssl = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.625994] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.626174] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.626336] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=59620) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.626534] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.626709] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_messaging_rabbit.ssl_version = {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.626896] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.627072] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_messaging_notifications.retry = -1 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.627254] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.627427] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_messaging_notifications.transport_url = **** {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.627594] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_limit.auth_section = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.627750] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_limit.auth_type = None {{(pid=59620) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.627906] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_limit.cafile = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.628086] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_limit.certfile = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.628250] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_limit.collect_timing = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.628442] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_limit.connect_retries = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.628609] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_limit.connect_retry_delay = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.628767] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_limit.endpoint_id = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.628925] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_limit.endpoint_override = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.629144] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_limit.insecure = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.629323] env[59620]: DEBUG 
oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_limit.keyfile = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.629475] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_limit.max_version = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.629632] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_limit.min_version = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.629785] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_limit.region_name = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.629938] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_limit.service_name = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.630104] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_limit.service_type = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.630267] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_limit.split_loggers = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.630419] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_limit.status_code_retries = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.630575] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] 
oslo_limit.status_code_retry_delay = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.630737] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_limit.timeout = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.630886] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_limit.valid_interfaces = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.631048] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_limit.version = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.631211] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_reports.file_event_handler = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.631373] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_reports.file_event_handler_interval = 1 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.631530] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] oslo_reports.log_dir = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.631701] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vif_plug_linux_bridge_privileged.capabilities = [12] {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.631854] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vif_plug_linux_bridge_privileged.group = None 
{{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.632014] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vif_plug_linux_bridge_privileged.helper_command = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.632200] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.632362] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vif_plug_linux_bridge_privileged.thread_pool_size = 8 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.632517] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vif_plug_linux_bridge_privileged.user = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.632683] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vif_plug_ovs_privileged.capabilities = [12, 1] {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.632839] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vif_plug_ovs_privileged.group = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.632994] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vif_plug_ovs_privileged.helper_command = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.633172] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] 
vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.633329] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vif_plug_ovs_privileged.thread_pool_size = 8 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.633483] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] vif_plug_ovs_privileged.user = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.633656] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] os_vif_linux_bridge.flat_interface = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.633824] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] os_vif_linux_bridge.forward_bridge_interface = ['all'] {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.633994] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] os_vif_linux_bridge.iptables_bottom_regex = {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.634172] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] os_vif_linux_bridge.iptables_drop_action = DROP {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.634338] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] os_vif_linux_bridge.iptables_top_regex = {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.634501] env[59620]: DEBUG oslo_service.service [None 
req-04ad95f9-5007-4989-b955-f99dc1343618 None None] os_vif_linux_bridge.network_device_mtu = 1500 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.634663] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] os_vif_linux_bridge.use_ipv6 = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.634821] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] os_vif_linux_bridge.vlan_interface = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.634993] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] os_vif_ovs.isolate_vif = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.635170] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] os_vif_ovs.network_device_mtu = 1500 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.635332] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] os_vif_ovs.ovs_vsctl_timeout = 120 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.635508] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.635711] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] os_vif_ovs.ovsdb_interface = native {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 561.635877] env[59620]: DEBUG oslo_service.service [None 
req-04ad95f9-5007-4989-b955-f99dc1343618 None None] os_vif_ovs.per_port_bridge = False {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.636061] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] os_brick.lock_path = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.636235] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] privsep_osbrick.capabilities = [21] {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.636413] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] privsep_osbrick.group = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.636585] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] privsep_osbrick.helper_command = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.636751] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] privsep_osbrick.logger_name = os_brick.privileged {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.636914] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] privsep_osbrick.thread_pool_size = 8 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.637083] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] privsep_osbrick.user = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.637255] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.637416] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] nova_sys_admin.group = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.637571] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] nova_sys_admin.helper_command = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.637730] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] nova_sys_admin.logger_name = oslo_privsep.daemon {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.637886] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] nova_sys_admin.thread_pool_size = 8 {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.638048] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] nova_sys_admin.user = None {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 561.638178] env[59620]: DEBUG oslo_service.service [None req-04ad95f9-5007-4989-b955-f99dc1343618 None None] ******************************************************************************** {{(pid=59620) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2613}}
[ 561.638598] env[59620]: INFO nova.service [-] Starting compute node (version 0.1.0)
[ 561.650779] env[59620]: INFO nova.virt.node [None req-b46aa174-bdd8-43a5-85ac-e14585e7d39c None None] Generated node identity 40bba435-8384-412d-aa10-bdcf44760016
[ 561.651008] env[59620]: INFO nova.virt.node [None req-b46aa174-bdd8-43a5-85ac-e14585e7d39c None None] Wrote node identity 40bba435-8384-412d-aa10-bdcf44760016 to /opt/stack/data/n-cpu-1/compute_id
[ 561.662482] env[59620]: WARNING nova.compute.manager [None req-b46aa174-bdd8-43a5-85ac-e14585e7d39c None None] Compute nodes ['40bba435-8384-412d-aa10-bdcf44760016'] for host cpu-1 were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
[ 561.693601] env[59620]: INFO nova.compute.manager [None req-b46aa174-bdd8-43a5-85ac-e14585e7d39c None None] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
[ 561.716513] env[59620]: WARNING nova.compute.manager [None req-b46aa174-bdd8-43a5-85ac-e14585e7d39c None None] No compute node record found for host cpu-1. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host cpu-1 could not be found.
[ 561.716748] env[59620]: DEBUG oslo_concurrency.lockutils [None req-b46aa174-bdd8-43a5-85ac-e14585e7d39c None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 561.716948] env[59620]: DEBUG oslo_concurrency.lockutils [None req-b46aa174-bdd8-43a5-85ac-e14585e7d39c None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 561.717104] env[59620]: DEBUG oslo_concurrency.lockutils [None req-b46aa174-bdd8-43a5-85ac-e14585e7d39c None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 561.717429] env[59620]: DEBUG nova.compute.resource_tracker [None req-b46aa174-bdd8-43a5-85ac-e14585e7d39c None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59620) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}}
[ 561.718323] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f647b9b-128f-4eb8-8de3-46e4a7135e62 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 561.726778] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28e801fe-57a9-450d-972f-62d683544a2c {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 561.740449] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a5a3b8c-de48-4986-ae62-98f3e697d74d {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 561.746871] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3541c830-7891-457c-9e46-fab727a55c5a {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 561.775161] env[59620]: DEBUG nova.compute.resource_tracker [None req-b46aa174-bdd8-43a5-85ac-e14585e7d39c None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181492MB free_disk=136GB free_vcpus=48 pci_devices=None {{(pid=59620) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}}
[ 561.775363] env[59620]: DEBUG oslo_concurrency.lockutils [None req-b46aa174-bdd8-43a5-85ac-e14585e7d39c None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 561.775490] env[59620]: DEBUG oslo_concurrency.lockutils [None req-b46aa174-bdd8-43a5-85ac-e14585e7d39c None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 561.787359] env[59620]: WARNING nova.compute.resource_tracker [None req-b46aa174-bdd8-43a5-85ac-e14585e7d39c None None] No compute node record for cpu-1:40bba435-8384-412d-aa10-bdcf44760016: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 40bba435-8384-412d-aa10-bdcf44760016 could not be found.
[ 561.800171] env[59620]: INFO nova.compute.resource_tracker [None req-b46aa174-bdd8-43a5-85ac-e14585e7d39c None None] Compute node record created for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 with uuid: 40bba435-8384-412d-aa10-bdcf44760016
[ 561.858712] env[59620]: DEBUG nova.compute.resource_tracker [None req-b46aa174-bdd8-43a5-85ac-e14585e7d39c None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=59620) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}}
[ 561.859034] env[59620]: DEBUG nova.compute.resource_tracker [None req-b46aa174-bdd8-43a5-85ac-e14585e7d39c None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=59620) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}}
[ 562.004414] env[59620]: INFO nova.scheduler.client.report [None req-b46aa174-bdd8-43a5-85ac-e14585e7d39c None None] [req-e83d767b-564d-42d0-afde-a3e343b3c77c] Created resource provider record via placement API for resource provider with UUID 40bba435-8384-412d-aa10-bdcf44760016 and name domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28.
[ 562.020664] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0bc704ad-0035-4288-93c9-f20ffd92b4da {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 562.028036] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ef22eb0-6591-4559-ae8b-966b3b7a29d2 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 562.056261] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62b7bcce-ade0-4539-8fa2-7fcac0d41c06 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 562.062931] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0be980cd-bd97-4c43-892c-cdda6b1a19e1 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 562.075423] env[59620]: DEBUG nova.compute.provider_tree [None req-b46aa174-bdd8-43a5-85ac-e14585e7d39c None None] Updating inventory in ProviderTree for provider 40bba435-8384-412d-aa10-bdcf44760016 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}}
[ 562.110041] env[59620]: DEBUG nova.scheduler.client.report [None req-b46aa174-bdd8-43a5-85ac-e14585e7d39c None None] Updated inventory for provider 40bba435-8384-412d-aa10-bdcf44760016 with generation 0 in Placement from set_inventory_for_provider using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}}
[ 562.110275] env[59620]: DEBUG nova.compute.provider_tree [None req-b46aa174-bdd8-43a5-85ac-e14585e7d39c None None] Updating resource provider 40bba435-8384-412d-aa10-bdcf44760016 generation from 0 to 1 during operation: update_inventory {{(pid=59620) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}}
[ 562.110414] env[59620]: DEBUG nova.compute.provider_tree [None req-b46aa174-bdd8-43a5-85ac-e14585e7d39c None None] Updating inventory in ProviderTree for provider 40bba435-8384-412d-aa10-bdcf44760016 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}}
[ 562.152948] env[59620]: DEBUG nova.compute.provider_tree [None req-b46aa174-bdd8-43a5-85ac-e14585e7d39c None None] Updating resource provider 40bba435-8384-412d-aa10-bdcf44760016 generation from 1 to 2 during operation: update_traits {{(pid=59620) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}}
[ 562.169315] env[59620]: DEBUG nova.compute.resource_tracker [None req-b46aa174-bdd8-43a5-85ac-e14585e7d39c None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59620) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}}
[ 562.169539] env[59620]: DEBUG oslo_concurrency.lockutils [None req-b46aa174-bdd8-43a5-85ac-e14585e7d39c None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.394s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 562.169702] env[59620]: DEBUG nova.service [None req-b46aa174-bdd8-43a5-85ac-e14585e7d39c None None] Creating RPC server for service compute {{(pid=59620) start /opt/stack/nova/nova/service.py:182}}
[ 562.182782] env[59620]: DEBUG nova.service [None req-b46aa174-bdd8-43a5-85ac-e14585e7d39c None None] Join ServiceGroup membership for this service compute {{(pid=59620) start /opt/stack/nova/nova/service.py:199}}
[ 562.182963] env[59620]: DEBUG nova.servicegroup.drivers.db [None req-b46aa174-bdd8-43a5-85ac-e14585e7d39c None None] DB_Driver: join new ServiceGroup member cpu-1 to the compute group, service = {{(pid=59620) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}}
[ 595.041460] env[59620]: DEBUG oslo_concurrency.lockutils [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Acquiring lock "1c80719e-14e8-467f-9195-683f681b0fd1" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 595.041767] env[59620]: DEBUG oslo_concurrency.lockutils [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Lock "1c80719e-14e8-467f-9195-683f681b0fd1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 595.057695] env[59620]: DEBUG nova.compute.manager [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] Starting instance... {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 595.146578] env[59620]: DEBUG oslo_concurrency.lockutils [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 595.146835] env[59620]: DEBUG oslo_concurrency.lockutils [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 595.148653] env[59620]: INFO nova.compute.claims [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 595.257058] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a876bdfa-e2dd-48b4-8680-8dbc80e2d609 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 595.265323] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-88fda845-a0ed-41bb-8bee-a65b919e73a3 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 595.296217] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c0dca23d-804d-4469-ac50-9a1bbfd1320b {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 595.303894] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb9f6c0b-890f-4220-9340-eb2263b44bd1 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 595.316920] env[59620]: DEBUG nova.compute.provider_tree [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 595.325523] env[59620]: DEBUG nova.scheduler.client.report [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 595.338194] env[59620]: DEBUG oslo_concurrency.lockutils [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.191s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 595.338765] env[59620]: DEBUG nova.compute.manager [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] Start building networks asynchronously for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 595.372876] env[59620]: DEBUG nova.compute.utils [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Using /dev/sd instead of None {{(pid=59620) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 595.374305] env[59620]: DEBUG nova.compute.manager [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] Allocating IP information in the background. {{(pid=59620) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 595.374416] env[59620]: DEBUG nova.network.neutron [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] allocate_for_instance() {{(pid=59620) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 595.385405] env[59620]: DEBUG nova.compute.manager [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] Start building block device mappings for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 595.452472] env[59620]: DEBUG nova.compute.manager [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] Start spawning the instance on the hypervisor. {{(pid=59620) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 597.189625] env[59620]: DEBUG nova.virt.hardware [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T20:10:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T20:10:40Z,direct_url=,disk_format='vmdk',id=2efa4364-ba59-4de9-978f-169a769ee710,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f77c77bd878b4e17895863a4a7a8191f',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T20:10:41Z,virtual_size=,visibility=), allow threads: False {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 597.190020] env[59620]: DEBUG nova.virt.hardware [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Flavor limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 597.190020] env[59620]: DEBUG nova.virt.hardware [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Image limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 597.190306] env[59620]: DEBUG nova.virt.hardware [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Flavor pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 597.190306] env[59620]: DEBUG nova.virt.hardware [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Image pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 597.190451] env[59620]: DEBUG nova.virt.hardware [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 597.190761] env[59620]: DEBUG nova.virt.hardware [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 597.190827] env[59620]: DEBUG nova.virt.hardware [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 597.191229] env[59620]: DEBUG nova.virt.hardware [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Got 1 possible topologies {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 597.191318] env[59620]: DEBUG nova.virt.hardware [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 597.191509] env[59620]: DEBUG nova.virt.hardware [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 597.192419] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6772dbaa-7457-4cb0-9491-e5c6b9419806 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 597.202527] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e880c8ff-e793-4eb7-b466-f94c40f928c3 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 597.220344] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-58e23aa2-b7ed-4f9c-936c-3e0c8098b970 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 597.358549] env[59620]: DEBUG nova.policy [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bff99d63ab404ea09c032a89af2616e0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3b081340a09140ed9283752785719b50', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59620) authorize /opt/stack/nova/nova/policy.py:203}}
[ 598.019030] env[59620]: DEBUG nova.network.neutron [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] Successfully created port: db2ecbed-1855-42ba-bc68-ab748c2d4651 {{(pid=59620) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 602.764292] env[59620]: ERROR nova.compute.manager [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port db2ecbed-1855-42ba-bc68-ab748c2d4651, please check neutron logs for more information.
[ 602.764292] env[59620]: ERROR nova.compute.manager Traceback (most recent call last):
[ 602.764292] env[59620]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 602.764292] env[59620]: ERROR nova.compute.manager     nwinfo = self.network_api.allocate_for_instance(
[ 602.764292] env[59620]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 602.764292] env[59620]: ERROR nova.compute.manager     created_port_ids = self._update_ports_for_instance(
[ 602.764292] env[59620]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 602.764292] env[59620]: ERROR nova.compute.manager     with excutils.save_and_reraise_exception():
[ 602.764292] env[59620]: ERROR nova.compute.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 602.764292] env[59620]: ERROR nova.compute.manager     self.force_reraise()
[ 602.764292] env[59620]: ERROR nova.compute.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 602.764292] env[59620]: ERROR nova.compute.manager     raise self.value
[ 602.764292] env[59620]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 602.764292] env[59620]: ERROR nova.compute.manager     updated_port = self._update_port(
[ 602.764292] env[59620]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 602.764292] env[59620]: ERROR nova.compute.manager     _ensure_no_port_binding_failure(port)
[ 602.765673] env[59620]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 602.765673] env[59620]: ERROR nova.compute.manager     raise exception.PortBindingFailed(port_id=port['id'])
[ 602.765673] env[59620]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port db2ecbed-1855-42ba-bc68-ab748c2d4651, please check neutron logs for more information.
[ 602.765673] env[59620]: ERROR nova.compute.manager
[ 602.765673] env[59620]: Traceback (most recent call last):
[ 602.765673] env[59620]:   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait
[ 602.765673] env[59620]:     listener.cb(fileno)
[ 602.765673] env[59620]:   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 602.765673] env[59620]:     result = function(*args, **kwargs)
[ 602.765673] env[59620]:   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 602.765673] env[59620]:     return func(*args, **kwargs)
[ 602.765673] env[59620]:   File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 602.765673] env[59620]:     raise e
[ 602.765673] env[59620]:   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 602.765673] env[59620]:     nwinfo = self.network_api.allocate_for_instance(
[ 602.765673] env[59620]:   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 602.765673] env[59620]:     created_port_ids = self._update_ports_for_instance(
[ 602.765673] env[59620]:   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 602.765673] env[59620]:     with excutils.save_and_reraise_exception():
[ 602.765673] env[59620]:   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 602.765673] env[59620]:     self.force_reraise()
[ 602.765673] env[59620]:   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 602.765673] env[59620]:     raise self.value
[ 602.765673] env[59620]:   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 602.765673] env[59620]:     updated_port = self._update_port(
[ 602.765673] env[59620]:   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 602.765673] env[59620]:     _ensure_no_port_binding_failure(port)
[ 602.765673] env[59620]:   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 602.765673] env[59620]:     raise exception.PortBindingFailed(port_id=port['id'])
[ 602.769350] env[59620]: nova.exception.PortBindingFailed: Binding failed for port db2ecbed-1855-42ba-bc68-ab748c2d4651, please check neutron logs for more information.
[ 602.769350] env[59620]: Removing descriptor: 11
[ 602.769350] env[59620]: ERROR nova.compute.manager [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port db2ecbed-1855-42ba-bc68-ab748c2d4651, please check neutron logs for more information.
[ 602.769350] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] Traceback (most recent call last):
[ 602.769350] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1]   File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 602.769350] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1]     yield resources
[ 602.769350] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 602.769350] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1]     self.driver.spawn(context, instance, image_meta,
[ 602.769350] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 602.769350] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 602.769350] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 602.769350] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1]     vm_ref = self.build_virtual_machine(instance,
[ 602.769962] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 602.769962] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1]     vif_infos = vmwarevif.get_vif_info(self._session,
[ 602.769962] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1]   File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 602.769962] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1]     for vif in network_info:
[ 602.769962] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1]   File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 602.769962] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1]     return self._sync_wrapper(fn, *args, **kwargs)
[ 602.769962] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1]   File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 602.769962] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1]     self.wait()
[ 602.769962] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1]   File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 602.769962] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1]     self[:] = self._gt.wait()
[ 602.769962] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 602.769962] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1]     return self._exit_event.wait()
[ 602.769962] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1]   File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 602.769962] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1]     result = hub.switch()
[ 602.770397] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1]   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 602.770397] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1]     return self.greenlet.switch()
[ 602.770397] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 602.770397] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1]     result = function(*args, **kwargs)
[ 602.770397] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1]   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 602.770397] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1]     return func(*args, **kwargs)
[ 602.770397] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1]   File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 602.770397] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1]     raise e
[ 602.770397] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1]   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 602.770397] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1]     nwinfo = self.network_api.allocate_for_instance(
[ 602.770397] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1]   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 602.770397] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1]     created_port_ids = self._update_ports_for_instance(
[ 602.770397] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1]   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 602.770759] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1]     with excutils.save_and_reraise_exception():
[ 602.770759] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 602.770759] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1]     self.force_reraise()
[ 602.770759] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 602.770759] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1]     raise self.value
[ 602.770759] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1]   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 602.770759] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1]     updated_port = self._update_port(
[ 602.770759] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1]   File
"/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 602.770759] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] _ensure_no_port_binding_failure(port) [ 602.770759] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 602.770759] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] raise exception.PortBindingFailed(port_id=port['id']) [ 602.770759] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] nova.exception.PortBindingFailed: Binding failed for port db2ecbed-1855-42ba-bc68-ab748c2d4651, please check neutron logs for more information. [ 602.770759] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] [ 602.775293] env[59620]: INFO nova.compute.manager [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] Terminating instance [ 602.775293] env[59620]: DEBUG oslo_concurrency.lockutils [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Acquiring lock "refresh_cache-1c80719e-14e8-467f-9195-683f681b0fd1" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 602.775293] env[59620]: DEBUG oslo_concurrency.lockutils [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Acquired lock "refresh_cache-1c80719e-14e8-467f-9195-683f681b0fd1" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 602.775293] env[59620]: DEBUG nova.network.neutron [None 
req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 602.888683] env[59620]: DEBUG nova.network.neutron [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 603.088735] env[59620]: DEBUG nova.network.neutron [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 603.111174] env[59620]: DEBUG oslo_concurrency.lockutils [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Releasing lock "refresh_cache-1c80719e-14e8-467f-9195-683f681b0fd1" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 603.111174] env[59620]: DEBUG nova.compute.manager [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] Start destroying the instance on the hypervisor. 
{{(pid=59620) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 603.111174] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] Destroying instance {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 603.112828] env[59620]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-dfb22b37-0564-45f2-abf6-ccc81435b1e4 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 603.130700] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f5f0bbde-cb4c-4cb4-ae91-1d81fc1c9920 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 603.168024] env[59620]: WARNING nova.virt.vmwareapi.vmops [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 1c80719e-14e8-467f-9195-683f681b0fd1 could not be found. 
[ 603.168024] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] Instance destroyed {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 603.168024] env[59620]: INFO nova.compute.manager [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] Took 0.06 seconds to destroy the instance on the hypervisor. [ 603.168024] env[59620]: DEBUG oslo.service.loopingcall [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=59620) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 603.168024] env[59620]: DEBUG nova.compute.manager [-] [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 603.168268] env[59620]: DEBUG nova.network.neutron [-] [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 603.201792] env[59620]: DEBUG nova.network.neutron [-] [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] Instance cache missing network info.
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 603.218879] env[59620]: DEBUG nova.network.neutron [-] [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 603.236383] env[59620]: INFO nova.compute.manager [-] [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] Took 0.07 seconds to deallocate network for instance. [ 603.241584] env[59620]: DEBUG nova.compute.claims [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] Aborting claim: {{(pid=59620) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 603.241895] env[59620]: DEBUG oslo_concurrency.lockutils [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 603.242031] env[59620]: DEBUG oslo_concurrency.lockutils [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 603.331446] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3b92ee6-23be-46b2-9222-5f700bcfff7c {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 603.341150] env[59620]: DEBUG oslo_vmware.service [-] 
Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d24c809-adf1-4922-ba53-3f677eb95a19 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 603.377216] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7a8416b2-e80d-42f9-bcea-c6d054a75c8d {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 603.386877] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1011c52a-ed73-4cdc-9ff1-1298393f42ea {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 603.401200] env[59620]: DEBUG nova.compute.provider_tree [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 603.412562] env[59620]: DEBUG nova.scheduler.client.report [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 603.433616] env[59620]: DEBUG oslo_concurrency.lockutils [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 
tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.191s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 603.434350] env[59620]: ERROR nova.compute.manager [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port db2ecbed-1855-42ba-bc68-ab748c2d4651, please check neutron logs for more information. [ 603.434350] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] Traceback (most recent call last): [ 603.434350] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 603.434350] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] self.driver.spawn(context, instance, image_meta, [ 603.434350] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 603.434350] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] self._vmops.spawn(context, instance, image_meta, injected_files, [ 603.434350] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 603.434350] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] vm_ref = self.build_virtual_machine(instance, [ 603.434350] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] 
File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 603.434350] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] vif_infos = vmwarevif.get_vif_info(self._session, [ 603.434350] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 603.434698] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] for vif in network_info: [ 603.434698] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 603.434698] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] return self._sync_wrapper(fn, *args, **kwargs) [ 603.434698] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 603.434698] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] self.wait() [ 603.434698] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 603.434698] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] self[:] = self._gt.wait() [ 603.434698] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 603.434698] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] return self._exit_event.wait() [ 603.434698] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 603.434698] env[59620]: ERROR 
nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] result = hub.switch() [ 603.434698] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 603.434698] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] return self.greenlet.switch() [ 603.434698] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 603.435120] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] result = function(*args, **kwargs) [ 603.435120] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 603.435120] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] return func(*args, **kwargs) [ 603.435120] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 603.435120] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] raise e [ 603.435120] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 603.435120] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] nwinfo = self.network_api.allocate_for_instance( [ 603.435120] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 603.435120] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] created_port_ids = 
self._update_ports_for_instance( [ 603.435120] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 603.435120] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] with excutils.save_and_reraise_exception(): [ 603.435120] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 603.435120] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] self.force_reraise() [ 603.435448] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 603.435448] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] raise self.value [ 603.435448] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 603.435448] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] updated_port = self._update_port( [ 603.435448] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 603.435448] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] _ensure_no_port_binding_failure(port) [ 603.435448] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 603.435448] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] raise 
exception.PortBindingFailed(port_id=port['id']) [ 603.435448] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] nova.exception.PortBindingFailed: Binding failed for port db2ecbed-1855-42ba-bc68-ab748c2d4651, please check neutron logs for more information. [ 603.435448] env[59620]: ERROR nova.compute.manager [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] [ 603.435448] env[59620]: DEBUG nova.compute.utils [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] Binding failed for port db2ecbed-1855-42ba-bc68-ab748c2d4651, please check neutron logs for more information. {{(pid=59620) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 603.440976] env[59620]: DEBUG nova.compute.manager [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] Build of instance 1c80719e-14e8-467f-9195-683f681b0fd1 was re-scheduled: Binding failed for port db2ecbed-1855-42ba-bc68-ab748c2d4651, please check neutron logs for more information. 
{{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 603.441605] env[59620]: DEBUG nova.compute.manager [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] Unplugging VIFs for instance {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 603.441870] env[59620]: DEBUG oslo_concurrency.lockutils [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Acquiring lock "refresh_cache-1c80719e-14e8-467f-9195-683f681b0fd1" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 603.442374] env[59620]: DEBUG oslo_concurrency.lockutils [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Acquired lock "refresh_cache-1c80719e-14e8-467f-9195-683f681b0fd1" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 603.442658] env[59620]: DEBUG nova.network.neutron [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 603.498628] env[59620]: DEBUG nova.network.neutron [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 603.855770] env[59620]: DEBUG nova.network.neutron [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 603.876038] env[59620]: DEBUG oslo_concurrency.lockutils [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Releasing lock "refresh_cache-1c80719e-14e8-467f-9195-683f681b0fd1" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 603.876038] env[59620]: DEBUG nova.compute.manager [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 603.876038] env[59620]: DEBUG nova.compute.manager [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 603.876038] env[59620]: DEBUG nova.network.neutron [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 603.913427] env[59620]: DEBUG nova.network.neutron [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 603.923902] env[59620]: DEBUG nova.network.neutron [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 603.935235] env[59620]: INFO nova.compute.manager [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 1c80719e-14e8-467f-9195-683f681b0fd1] Took 0.06 seconds to deallocate network for instance. 
[ 604.049247] env[59620]: INFO nova.scheduler.client.report [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Deleted allocations for instance 1c80719e-14e8-467f-9195-683f681b0fd1 [ 604.080438] env[59620]: DEBUG oslo_concurrency.lockutils [None req-0cfd1e4c-7ac0-4d6a-a73d-ff320bba0ee9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Lock "1c80719e-14e8-467f-9195-683f681b0fd1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.039s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 605.979075] env[59620]: DEBUG oslo_concurrency.lockutils [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Acquiring lock "2f8a30ee-d22b-42e6-abe6-db22d9a6fe73" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 605.979075] env[59620]: DEBUG oslo_concurrency.lockutils [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Lock "2f8a30ee-d22b-42e6-abe6-db22d9a6fe73" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 606.001048] env[59620]: DEBUG nova.compute.manager [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] Starting instance...
{{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 606.068629] env[59620]: DEBUG oslo_concurrency.lockutils [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 606.068709] env[59620]: DEBUG oslo_concurrency.lockutils [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 606.072264] env[59620]: INFO nova.compute.claims [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 606.185982] env[59620]: DEBUG oslo_service.periodic_task [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Running periodic task ComputeManager._sync_power_states {{(pid=59620) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 606.205666] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Getting list of instances from cluster (obj){ [ 606.205666] env[59620]: value = "domain-c8" [ 606.205666] env[59620]: _type = "ClusterComputeResource" [ 606.205666] env[59620]: } {{(pid=59620) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 606.205666] env[59620]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-91300481-0c0b-4851-ae81-1055b35a7515 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.215078] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d23b1ac-29bf-4aed-ad84-ffb953bc05e1 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.226230] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Got total of 0 instances {{(pid=59620) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 606.226823] env[59620]: WARNING nova.compute.manager [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] While synchronizing instance power states, found 1 instances in the database and 0 instances on the hypervisor. [ 606.227102] env[59620]: DEBUG nova.compute.manager [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Triggering sync for uuid 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73 {{(pid=59620) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10224}} [ 606.227924] env[59620]: DEBUG oslo_concurrency.lockutils [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Acquiring lock "2f8a30ee-d22b-42e6-abe6-db22d9a6fe73" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 606.228575] env[59620]: DEBUG oslo_service.periodic_task [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=59620) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 606.229215] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Getting list of instances from cluster (obj){ [ 
606.229215] env[59620]: value = "domain-c8" [ 606.229215] env[59620]: _type = "ClusterComputeResource" [ 606.229215] env[59620]: } {{(pid=59620) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 606.234100] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14a8a99b-5d8d-494c-960f-40e05facafa3 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.244935] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a57e5b92-da19-4b72-8506-cc87092cdaa8 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.249659] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Got total of 0 instances {{(pid=59620) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 606.282804] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d01d4e1b-67b5-440c-8bbb-b3d9da982d19 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.289641] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d81ca18a-38a6-4d76-914b-578b343df54f {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.303732] env[59620]: DEBUG nova.compute.provider_tree [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 606.319280] env[59620]: DEBUG nova.scheduler.client.report [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 
tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 606.338717] env[59620]: DEBUG oslo_concurrency.lockutils [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.269s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 606.338717] env[59620]: DEBUG nova.compute.manager [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] Start building networks asynchronously for instance. 
{{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 606.396976] env[59620]: DEBUG nova.compute.utils [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Using /dev/sd instead of None {{(pid=59620) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 606.398367] env[59620]: DEBUG nova.compute.manager [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] Not allocating networking since 'none' was specified. {{(pid=59620) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1948}} [ 606.419090] env[59620]: DEBUG nova.compute.manager [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] Start building block device mappings for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 606.520748] env[59620]: DEBUG nova.compute.manager [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] Start spawning the instance on the hypervisor. 
{{(pid=59620) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 606.553061] env[59620]: DEBUG nova.virt.hardware [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T20:10:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T20:10:40Z,direct_url=,disk_format='vmdk',id=2efa4364-ba59-4de9-978f-169a769ee710,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f77c77bd878b4e17895863a4a7a8191f',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T20:10:41Z,virtual_size=,visibility=), allow threads: False {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 606.553292] env[59620]: DEBUG nova.virt.hardware [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Flavor limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 606.553437] env[59620]: DEBUG nova.virt.hardware [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Image limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 606.553605] env[59620]: DEBUG nova.virt.hardware [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 
tempest-ServerDiagnosticsV248Test-2041210563-project-member] Flavor pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 606.553779] env[59620]: DEBUG nova.virt.hardware [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Image pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 606.553871] env[59620]: DEBUG nova.virt.hardware [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 606.556447] env[59620]: DEBUG nova.virt.hardware [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 606.556748] env[59620]: DEBUG nova.virt.hardware [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 606.556901] env[59620]: DEBUG nova.virt.hardware [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Got 1 possible topologies {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 606.557026] env[59620]: DEBUG nova.virt.hardware [None 
req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 606.557196] env[59620]: DEBUG nova.virt.hardware [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 606.558087] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af5a600d-9e92-4c3c-be5e-3983655d643d {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.573093] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-919d15eb-c0f5-4110-959b-007b2faec668 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.594559] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] Instance VIF info [] {{(pid=59620) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 606.610619] env[59620]: DEBUG nova.virt.vmwareapi.vm_util [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Creating folder: OpenStack. Parent ref: group-v4. 
{{(pid=59620) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 606.611379] env[59620]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f3e4614b-3400-49f0-8d10-98623327f0e8 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.625650] env[59620]: INFO nova.virt.vmwareapi.vm_util [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Created folder: OpenStack in parent group-v4. [ 606.625846] env[59620]: DEBUG nova.virt.vmwareapi.vm_util [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Creating folder: Project (9468510ce958424da5fc4ec68a07d6e9). Parent ref: group-v280263. {{(pid=59620) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 606.626101] env[59620]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-09d4f58c-f2f8-492c-8a8a-b39ff4f3128b {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.634907] env[59620]: INFO nova.virt.vmwareapi.vm_util [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Created folder: Project (9468510ce958424da5fc4ec68a07d6e9) in parent group-v280263. [ 606.635102] env[59620]: DEBUG nova.virt.vmwareapi.vm_util [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Creating folder: Instances. Parent ref: group-v280264. 
{{(pid=59620) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 606.635327] env[59620]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-eaf7a7fb-6666-4197-b3bd-ef37c020a678 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.650195] env[59620]: INFO nova.virt.vmwareapi.vm_util [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Created folder: Instances in parent group-v280264. [ 606.650456] env[59620]: DEBUG oslo.service.loopingcall [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59620) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 606.650640] env[59620]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] Creating VM on the ESX host {{(pid=59620) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 606.650911] env[59620]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-2cf49e1f-b5a0-4951-8288-47a91c108845 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.671368] env[59620]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 606.671368] env[59620]: value = "task-1308612" [ 606.671368] env[59620]: _type = "Task" [ 606.671368] env[59620]: } to complete. {{(pid=59620) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 606.683937] env[59620]: DEBUG oslo_vmware.api [-] Task: {'id': task-1308612, 'name': CreateVM_Task} progress is 0%. 
{{(pid=59620) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 607.184585] env[59620]: DEBUG oslo_vmware.api [-] Task: {'id': task-1308612, 'name': CreateVM_Task, 'duration_secs': 0.270441} completed successfully. {{(pid=59620) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 607.184585] env[59620]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] Created VM on the ESX host {{(pid=59620) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 607.185204] env[59620]: DEBUG oslo_vmware.service [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a8949ce-7dee-4508-b36a-6ba8b6fe9153 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 607.196898] env[59620]: DEBUG oslo_concurrency.lockutils [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/2efa4364-ba59-4de9-978f-169a769ee710" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 607.196898] env[59620]: DEBUG oslo_concurrency.lockutils [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Acquired lock "[datastore1] devstack-image-cache_base/2efa4364-ba59-4de9-978f-169a769ee710" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 607.196898] env[59620]: DEBUG oslo_concurrency.lockutils [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Acquired external 
semaphore "[datastore1] devstack-image-cache_base/2efa4364-ba59-4de9-978f-169a769ee710" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 607.197141] env[59620]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3170a5a9-a74a-438b-b218-5b1072f92b98 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 607.201526] env[59620]: DEBUG oslo_vmware.api [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Waiting for the task: (returnval){ [ 607.201526] env[59620]: value = "session[52dd88a5-b9b5-fd00-4be4-1e40bbf1fd5a]526cfc2e-dc31-f835-22cd-aab1d1d2e5df" [ 607.201526] env[59620]: _type = "Task" [ 607.201526] env[59620]: } to complete. {{(pid=59620) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 607.210613] env[59620]: DEBUG oslo_vmware.api [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Task: {'id': session[52dd88a5-b9b5-fd00-4be4-1e40bbf1fd5a]526cfc2e-dc31-f835-22cd-aab1d1d2e5df, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59620) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 607.716685] env[59620]: DEBUG oslo_concurrency.lockutils [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Releasing lock "[datastore1] devstack-image-cache_base/2efa4364-ba59-4de9-978f-169a769ee710" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 607.716685] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] Processing image 2efa4364-ba59-4de9-978f-169a769ee710 {{(pid=59620) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 607.716685] env[59620]: DEBUG oslo_concurrency.lockutils [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/2efa4364-ba59-4de9-978f-169a769ee710/2efa4364-ba59-4de9-978f-169a769ee710.vmdk" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 607.716685] env[59620]: DEBUG oslo_concurrency.lockutils [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Acquired lock "[datastore1] devstack-image-cache_base/2efa4364-ba59-4de9-978f-169a769ee710/2efa4364-ba59-4de9-978f-169a769ee710.vmdk" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 607.716886] env[59620]: DEBUG nova.virt.vmwareapi.ds_util [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Creating 
directory with path [datastore1] devstack-image-cache_base {{(pid=59620) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 607.716886] env[59620]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-83951c7b-ce80-4c5d-8349-7cfcdaf49d1d {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 607.732901] env[59620]: DEBUG nova.virt.vmwareapi.ds_util [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59620) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 607.733090] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59620) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 607.734218] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81f8064b-fd46-4ad3-8079-01cbc6c7e6fb {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 607.742690] env[59620]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e64200e2-4321-4c49-9748-d179eaa4387d {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 607.751577] env[59620]: DEBUG oslo_vmware.api [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Waiting for the task: (returnval){ [ 607.751577] env[59620]: value = "session[52dd88a5-b9b5-fd00-4be4-1e40bbf1fd5a]52d3e5d6-2f72-3b4d-2f0f-0e2db0e845e4" [ 607.751577] env[59620]: 
_type = "Task" [ 607.751577] env[59620]: } to complete. {{(pid=59620) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 607.760709] env[59620]: DEBUG oslo_vmware.api [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Task: {'id': session[52dd88a5-b9b5-fd00-4be4-1e40bbf1fd5a]52d3e5d6-2f72-3b4d-2f0f-0e2db0e845e4, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59620) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 608.261974] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] Preparing fetch location {{(pid=59620) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 608.262281] env[59620]: DEBUG nova.virt.vmwareapi.ds_util [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Creating directory with path [datastore1] vmware_temp/babc7654-fad0-49d7-893b-2b5cb64b39b0/2efa4364-ba59-4de9-978f-169a769ee710 {{(pid=59620) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 608.262460] env[59620]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e0e3d83e-bb59-4100-b7b6-c3c864261369 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 608.284192] env[59620]: DEBUG nova.virt.vmwareapi.ds_util [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Created directory with path [datastore1] vmware_temp/babc7654-fad0-49d7-893b-2b5cb64b39b0/2efa4364-ba59-4de9-978f-169a769ee710 {{(pid=59620) mkdir 
/opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 608.284393] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] Fetch image to [datastore1] vmware_temp/babc7654-fad0-49d7-893b-2b5cb64b39b0/2efa4364-ba59-4de9-978f-169a769ee710/tmp-sparse.vmdk {{(pid=59620) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 608.284667] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] Downloading image file data 2efa4364-ba59-4de9-978f-169a769ee710 to [datastore1] vmware_temp/babc7654-fad0-49d7-893b-2b5cb64b39b0/2efa4364-ba59-4de9-978f-169a769ee710/tmp-sparse.vmdk on the data store datastore1 {{(pid=59620) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 608.285602] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cdb16e11-704d-4fbc-92d7-e433dce68cc5 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 608.293620] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3aa0f8ea-3ca2-4beb-86cb-28be3d81b86f {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 608.305975] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b75c748-8066-471b-bbf7-2862a4a40109 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 608.346512] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-9672e4cf-4d7f-4a70-b5b1-1b48e6724b02 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 608.352531] env[59620]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-7b7f5472-1fae-400b-8a51-b3448d354e3d {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 608.381708] env[59620]: DEBUG nova.virt.vmwareapi.images [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] Downloading image file data 2efa4364-ba59-4de9-978f-169a769ee710 to the data store datastore1 {{(pid=59620) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 608.389673] env[59620]: DEBUG oslo_concurrency.lockutils [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Acquiring lock "89af0723-c7fc-4d6f-90f5-6f69e7a3630b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 608.389912] env[59620]: DEBUG oslo_concurrency.lockutils [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Lock "89af0723-c7fc-4d6f-90f5-6f69e7a3630b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 608.405125] env[59620]: DEBUG nova.compute.manager [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] [instance: 
89af0723-c7fc-4d6f-90f5-6f69e7a3630b] Starting instance... {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 608.457499] env[59620]: DEBUG oslo_vmware.rw_handles [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/babc7654-fad0-49d7-893b-2b5cb64b39b0/2efa4364-ba59-4de9-978f-169a769ee710/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59620) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}}
[ 608.519278] env[59620]: DEBUG oslo_concurrency.lockutils [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 608.519519] env[59620]: DEBUG oslo_concurrency.lockutils [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 608.522128] env[59620]: INFO nova.compute.claims [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 608.525520] env[59620]: DEBUG oslo_vmware.rw_handles [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Completed reading data from the image iterator. {{(pid=59620) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}}
[ 608.525685] env[59620]: DEBUG oslo_vmware.rw_handles [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/babc7654-fad0-49d7-893b-2b5cb64b39b0/2efa4364-ba59-4de9-978f-169a769ee710/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59620) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}}
[ 608.670482] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6fb758d0-2577-4bdf-bdb8-242736dc5b2a {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 608.685256] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ce31abc-2b2f-4a6d-99d8-3bbcb06128e0 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 608.727367] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c5243046-dec2-4b1b-8c4f-b46f1d152a31 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 608.735745] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc658f1e-5ed3-4184-b9b3-c896d3b3e038 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 608.750889] env[59620]: DEBUG nova.compute.provider_tree [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 608.764609] env[59620]: DEBUG nova.scheduler.client.report [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 608.792021] env[59620]: DEBUG oslo_concurrency.lockutils [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.270s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 608.792021] env[59620]: DEBUG nova.compute.manager [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] Start building networks asynchronously for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 608.830778] env[59620]: DEBUG nova.compute.utils [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Using /dev/sd instead of None {{(pid=59620) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 608.832142] env[59620]: DEBUG nova.compute.manager [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] Allocating IP information in the background. {{(pid=59620) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 608.832463] env[59620]: DEBUG nova.network.neutron [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] allocate_for_instance() {{(pid=59620) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 608.859211] env[59620]: DEBUG nova.compute.manager [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] Start building block device mappings for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 608.941973] env[59620]: DEBUG nova.compute.manager [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] Start spawning the instance on the hypervisor. {{(pid=59620) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 608.980267] env[59620]: DEBUG nova.virt.hardware [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T20:10:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T20:10:40Z,direct_url=,disk_format='vmdk',id=2efa4364-ba59-4de9-978f-169a769ee710,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f77c77bd878b4e17895863a4a7a8191f',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T20:10:41Z,virtual_size=,visibility=), allow threads: False {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 608.980448] env[59620]: DEBUG nova.virt.hardware [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Flavor limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 608.980593] env[59620]: DEBUG nova.virt.hardware [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Image limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 608.980761] env[59620]: DEBUG nova.virt.hardware [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Flavor pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 608.980895] env[59620]: DEBUG nova.virt.hardware [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Image pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 608.981212] env[59620]: DEBUG nova.virt.hardware [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 608.981471] env[59620]: DEBUG nova.virt.hardware [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 608.981471] env[59620]: DEBUG nova.virt.hardware [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 608.981598] env[59620]: DEBUG nova.virt.hardware [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Got 1 possible topologies {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 608.981678] env[59620]: DEBUG nova.virt.hardware [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 608.981875] env[59620]: DEBUG nova.virt.hardware [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 608.983100] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cca6c877-1859-4bea-af13-064050e45c37 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 608.995350] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3a768b2a-0d36-4a4a-a73b-5143422c65e6 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 609.222817] env[59620]: DEBUG nova.policy [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c96ee8fc6a5445098644e375c7df4919', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ec0cec8728ec40b286d68178a70794e8', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59620) authorize /opt/stack/nova/nova/policy.py:203}}
[ 610.398071] env[59620]: DEBUG nova.network.neutron [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] Successfully created port: 30aab2fb-c3eb-4490-85a0-503a72da63d2 {{(pid=59620) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 610.475801] env[59620]: DEBUG oslo_concurrency.lockutils [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Acquiring lock "80421e87-c5bb-4eae-acd0-fa2ce12d8bbe" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 610.476024] env[59620]: DEBUG oslo_concurrency.lockutils [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Lock "80421e87-c5bb-4eae-acd0-fa2ce12d8bbe" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 610.496017] env[59620]: DEBUG nova.compute.manager [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] Starting instance... {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 610.573877] env[59620]: DEBUG oslo_concurrency.lockutils [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 610.573877] env[59620]: DEBUG oslo_concurrency.lockutils [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 610.574297] env[59620]: INFO nova.compute.claims [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 610.726849] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4db610b-872e-49bf-aabb-a10e9bb5148c {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 610.740662] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e037bc80-972c-4f14-96d0-befa77954bf0 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 610.775411] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27c1b62a-9485-48e2-acf7-91fdf9cef059 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 610.783177] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-36e05ee1-a77a-4fd8-b509-1303c7bc8010 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 610.796555] env[59620]: DEBUG nova.compute.provider_tree [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 610.817364] env[59620]: DEBUG nova.scheduler.client.report [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 610.829784] env[59620]: DEBUG oslo_concurrency.lockutils [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.257s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 610.830676] env[59620]: DEBUG nova.compute.manager [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] Start building networks asynchronously for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 610.878878] env[59620]: DEBUG nova.compute.utils [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Using /dev/sd instead of None {{(pid=59620) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 610.879949] env[59620]: DEBUG nova.compute.manager [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] Allocating IP information in the background. {{(pid=59620) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 610.880494] env[59620]: DEBUG nova.network.neutron [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] allocate_for_instance() {{(pid=59620) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 610.890088] env[59620]: DEBUG nova.compute.manager [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] Start building block device mappings for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 610.973124] env[59620]: DEBUG nova.compute.manager [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] Start spawning the instance on the hypervisor. {{(pid=59620) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 611.011679] env[59620]: DEBUG nova.virt.hardware [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T20:10:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T20:10:40Z,direct_url=,disk_format='vmdk',id=2efa4364-ba59-4de9-978f-169a769ee710,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f77c77bd878b4e17895863a4a7a8191f',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T20:10:41Z,virtual_size=,visibility=), allow threads: False {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 611.011908] env[59620]: DEBUG nova.virt.hardware [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Flavor limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 611.012064] env[59620]: DEBUG nova.virt.hardware [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Image limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 611.012236] env[59620]: DEBUG nova.virt.hardware [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Flavor pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 611.012681] env[59620]: DEBUG nova.virt.hardware [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Image pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 611.012918] env[59620]: DEBUG nova.virt.hardware [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 611.013212] env[59620]: DEBUG nova.virt.hardware [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 611.013407] env[59620]: DEBUG nova.virt.hardware [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 611.013628] env[59620]: DEBUG nova.virt.hardware [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Got 1 possible topologies {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 611.013873] env[59620]: DEBUG nova.virt.hardware [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 611.014134] env[59620]: DEBUG nova.virt.hardware [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 611.015268] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f3b13f7f-665b-413c-b868-d5604d337765 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 611.025603] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b0b8d34-0aa7-4410-8790-d3ed361b004f {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 611.101927] env[59620]: DEBUG nova.policy [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2f73951d67dc467a96738f87697c6f62', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '64a2dca870f64583ae77cb64d7eff903', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59620) authorize /opt/stack/nova/nova/policy.py:203}}
[ 611.588474] env[59620]: DEBUG oslo_concurrency.lockutils [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Acquiring lock "757b0e86-7d50-46c8-b69a-7e729d925cb1" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 611.588873] env[59620]: DEBUG oslo_concurrency.lockutils [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Lock "757b0e86-7d50-46c8-b69a-7e729d925cb1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 611.605982] env[59620]: DEBUG nova.compute.manager [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] Starting instance... {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 611.670428] env[59620]: DEBUG oslo_concurrency.lockutils [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 611.670428] env[59620]: DEBUG oslo_concurrency.lockutils [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 611.671824] env[59620]: INFO nova.compute.claims [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 611.822905] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-77316270-1d99-4dbd-842c-4c2c616d2b41 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 611.835021] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa692367-b199-4be0-bc34-d8d885725dbc {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 611.872191] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dbe22f79-5ebe-4271-93e4-8d72d4afe290 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 611.880333] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f5aa2d9-6dc2-4a81-9d15-1f53ca52d862 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 611.895339] env[59620]: DEBUG nova.compute.provider_tree [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 611.905544] env[59620]: DEBUG nova.scheduler.client.report [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 611.929202] env[59620]: DEBUG oslo_concurrency.lockutils [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.259s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 611.932588] env[59620]: DEBUG nova.compute.manager [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] Start building networks asynchronously for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 611.973720] env[59620]: DEBUG nova.compute.utils [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Using /dev/sd instead of None {{(pid=59620) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 611.975757] env[59620]: DEBUG nova.compute.manager [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] Allocating IP information in the background. {{(pid=59620) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 611.975959] env[59620]: DEBUG nova.network.neutron [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] allocate_for_instance() {{(pid=59620) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 611.987061] env[59620]: DEBUG nova.compute.manager [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] Start building block device mappings for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 612.061427] env[59620]: DEBUG nova.compute.manager [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] Start spawning the instance on the hypervisor. {{(pid=59620) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 612.090194] env[59620]: DEBUG nova.virt.hardware [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T20:10:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T20:10:40Z,direct_url=,disk_format='vmdk',id=2efa4364-ba59-4de9-978f-169a769ee710,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f77c77bd878b4e17895863a4a7a8191f',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T20:10:41Z,virtual_size=,visibility=), allow threads: False {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 612.090524] env[59620]: DEBUG nova.virt.hardware [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Flavor limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 612.090591] env[59620]: DEBUG nova.virt.hardware [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Image limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 612.091152] env[59620]: DEBUG nova.virt.hardware [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Flavor pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 612.091330] env[59620]: DEBUG nova.virt.hardware [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Image pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 612.091447] env[59620]: DEBUG nova.virt.hardware [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 612.091713] env[59620]: DEBUG nova.virt.hardware [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 612.092560] env[59620]: DEBUG nova.virt.hardware [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 612.093603] env[59620]: DEBUG nova.virt.hardware [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Got 1 possible topologies {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 612.093603] env[59620]: DEBUG nova.virt.hardware [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 612.093733] env[59620]: DEBUG nova.virt.hardware [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 612.094622] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-935dac66-20c8-40bf-b085-8c43139d19d4 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 612.104597] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe1d0527-a5c8-4535-8425-6b51e5328560 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 612.307862] env[59620]: DEBUG nova.policy [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bff99d63ab404ea09c032a89af2616e0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3b081340a09140ed9283752785719b50', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59620) authorize /opt/stack/nova/nova/policy.py:203}}
[ 613.381158] env[59620]: DEBUG nova.network.neutron [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] Successfully created port: 438a9f07-b733-41d5-a82a-a560eeadf95c {{(pid=59620) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 613.762773] env[59620]: DEBUG oslo_concurrency.lockutils [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Acquiring lock "05155891-6002-4ac0-8386-62e8db523152" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 613.762890] env[59620]: DEBUG oslo_concurrency.lockutils [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Lock "05155891-6002-4ac0-8386-62e8db523152" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 613.802442] env[59620]: DEBUG nova.compute.manager [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 05155891-6002-4ac0-8386-62e8db523152] Starting instance...
{{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 613.867628] env[59620]: DEBUG oslo_concurrency.lockutils [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 613.867869] env[59620]: DEBUG oslo_concurrency.lockutils [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 613.869446] env[59620]: INFO nova.compute.claims [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 05155891-6002-4ac0-8386-62e8db523152] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 614.042156] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8beee417-2c71-490a-8090-e19234b6d79f {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 614.050633] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5785bff5-c275-4aa1-9260-34aa2a12de63 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 614.081992] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f8b9e4d-09c1-4927-9404-a634b9b19e90 {{(pid=59620) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 614.090650] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-283598db-51dd-4643-b9a5-0047582bd487 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 614.106491] env[59620]: DEBUG nova.compute.provider_tree [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 614.118751] env[59620]: DEBUG nova.scheduler.client.report [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 614.139243] env[59620]: DEBUG oslo_concurrency.lockutils [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.269s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 614.139243] env[59620]: DEBUG nova.compute.manager [None req-5791d303-4116-492a-a4ea-80ee8f90a627 
tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 05155891-6002-4ac0-8386-62e8db523152] Start building networks asynchronously for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 614.192213] env[59620]: DEBUG nova.compute.utils [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Using /dev/sd instead of None {{(pid=59620) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 614.193488] env[59620]: DEBUG nova.compute.manager [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 05155891-6002-4ac0-8386-62e8db523152] Allocating IP information in the background. {{(pid=59620) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 614.193808] env[59620]: DEBUG nova.network.neutron [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 05155891-6002-4ac0-8386-62e8db523152] allocate_for_instance() {{(pid=59620) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 614.206392] env[59620]: DEBUG nova.compute.manager [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 05155891-6002-4ac0-8386-62e8db523152] Start building block device mappings for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 614.293740] env[59620]: DEBUG nova.compute.manager [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 05155891-6002-4ac0-8386-62e8db523152] Start spawning the instance on the hypervisor. 
{{(pid=59620) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 614.338365] env[59620]: DEBUG nova.virt.hardware [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T20:10:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T20:10:40Z,direct_url=,disk_format='vmdk',id=2efa4364-ba59-4de9-978f-169a769ee710,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f77c77bd878b4e17895863a4a7a8191f',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T20:10:41Z,virtual_size=,visibility=), allow threads: False {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 614.339941] env[59620]: DEBUG nova.virt.hardware [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Flavor limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 614.339941] env[59620]: DEBUG nova.virt.hardware [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Image limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 614.339941] env[59620]: DEBUG nova.virt.hardware [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Flavor pref 0:0:0 {{(pid=59620) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 614.339941] env[59620]: DEBUG nova.virt.hardware [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Image pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 614.339941] env[59620]: DEBUG nova.virt.hardware [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 614.340316] env[59620]: DEBUG nova.virt.hardware [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 614.340316] env[59620]: DEBUG nova.virt.hardware [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 614.340316] env[59620]: DEBUG nova.virt.hardware [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Got 1 possible topologies {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 614.340316] env[59620]: DEBUG nova.virt.hardware [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Possible topologies 
[VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 614.340436] env[59620]: DEBUG nova.virt.hardware [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 614.341276] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-edb822fe-cfd2-4ba7-b262-da45f45c3e32 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 614.355089] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7b2d2f8-82ff-462c-bc50-89e6cf4ff6f1 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 614.379330] env[59620]: DEBUG nova.network.neutron [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] Successfully created port: b10348ce-ec95-49ff-bce7-29c566656dd9 {{(pid=59620) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 614.686909] env[59620]: DEBUG nova.policy [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9ca548ffb6fe4cd9a91d5a743fd851cd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1c65b725b7474926832d3ffd92af67db', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 
'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59620) authorize /opt/stack/nova/nova/policy.py:203}} [ 615.468053] env[59620]: ERROR nova.compute.manager [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 30aab2fb-c3eb-4490-85a0-503a72da63d2, please check neutron logs for more information. [ 615.468053] env[59620]: ERROR nova.compute.manager Traceback (most recent call last): [ 615.468053] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 615.468053] env[59620]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 615.468053] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 615.468053] env[59620]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 615.468053] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 615.468053] env[59620]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 615.468053] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 615.468053] env[59620]: ERROR nova.compute.manager self.force_reraise() [ 615.468053] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 615.468053] env[59620]: ERROR nova.compute.manager raise self.value [ 615.468053] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 615.468053] env[59620]: ERROR 
nova.compute.manager updated_port = self._update_port( [ 615.468053] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 615.468053] env[59620]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 615.468858] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 615.468858] env[59620]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 615.468858] env[59620]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 30aab2fb-c3eb-4490-85a0-503a72da63d2, please check neutron logs for more information. [ 615.468858] env[59620]: ERROR nova.compute.manager [ 615.468858] env[59620]: Traceback (most recent call last): [ 615.468858] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 615.468858] env[59620]: listener.cb(fileno) [ 615.468858] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 615.468858] env[59620]: result = function(*args, **kwargs) [ 615.468858] env[59620]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 615.468858] env[59620]: return func(*args, **kwargs) [ 615.468858] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 615.468858] env[59620]: raise e [ 615.468858] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 615.468858] env[59620]: nwinfo = self.network_api.allocate_for_instance( [ 615.468858] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 615.468858] env[59620]: created_port_ids = self._update_ports_for_instance( [ 615.468858] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 615.468858] env[59620]: with 
excutils.save_and_reraise_exception(): [ 615.468858] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 615.468858] env[59620]: self.force_reraise() [ 615.468858] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 615.468858] env[59620]: raise self.value [ 615.468858] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 615.468858] env[59620]: updated_port = self._update_port( [ 615.468858] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 615.468858] env[59620]: _ensure_no_port_binding_failure(port) [ 615.468858] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 615.468858] env[59620]: raise exception.PortBindingFailed(port_id=port['id']) [ 615.469722] env[59620]: nova.exception.PortBindingFailed: Binding failed for port 30aab2fb-c3eb-4490-85a0-503a72da63d2, please check neutron logs for more information. [ 615.469722] env[59620]: Removing descriptor: 11 [ 615.469722] env[59620]: ERROR nova.compute.manager [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 30aab2fb-c3eb-4490-85a0-503a72da63d2, please check neutron logs for more information. 
[ 615.469722] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] Traceback (most recent call last): [ 615.469722] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 615.469722] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] yield resources [ 615.469722] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 615.469722] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] self.driver.spawn(context, instance, image_meta, [ 615.469722] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 615.469722] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 615.469722] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 615.469722] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] vm_ref = self.build_virtual_machine(instance, [ 615.470115] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 615.470115] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] vif_infos = vmwarevif.get_vif_info(self._session, [ 615.470115] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 615.470115] env[59620]: ERROR 
nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] for vif in network_info: [ 615.470115] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 615.470115] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] return self._sync_wrapper(fn, *args, **kwargs) [ 615.470115] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 615.470115] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] self.wait() [ 615.470115] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 615.470115] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] self[:] = self._gt.wait() [ 615.470115] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 615.470115] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] return self._exit_event.wait() [ 615.470115] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 615.470115] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] result = hub.switch() [ 615.470470] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 615.470470] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] return self.greenlet.switch() [ 615.470470] env[59620]: ERROR 
nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 615.470470] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] result = function(*args, **kwargs) [ 615.470470] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 615.470470] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] return func(*args, **kwargs) [ 615.470470] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 615.470470] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] raise e [ 615.470470] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 615.470470] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] nwinfo = self.network_api.allocate_for_instance( [ 615.470470] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 615.470470] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] created_port_ids = self._update_ports_for_instance( [ 615.470470] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 615.470812] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] with excutils.save_and_reraise_exception(): [ 615.470812] env[59620]: ERROR nova.compute.manager [instance: 
89af0723-c7fc-4d6f-90f5-6f69e7a3630b] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 615.470812] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] self.force_reraise() [ 615.470812] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 615.470812] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] raise self.value [ 615.470812] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 615.470812] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] updated_port = self._update_port( [ 615.470812] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 615.470812] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] _ensure_no_port_binding_failure(port) [ 615.470812] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 615.470812] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] raise exception.PortBindingFailed(port_id=port['id']) [ 615.470812] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] nova.exception.PortBindingFailed: Binding failed for port 30aab2fb-c3eb-4490-85a0-503a72da63d2, please check neutron logs for more information. 
[ 615.470812] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b]
[ 615.471237] env[59620]: INFO nova.compute.manager [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] Terminating instance
[ 615.472325] env[59620]: DEBUG oslo_concurrency.lockutils [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Acquiring lock "refresh_cache-89af0723-c7fc-4d6f-90f5-6f69e7a3630b" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 615.472587] env[59620]: DEBUG oslo_concurrency.lockutils [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Acquired lock "refresh_cache-89af0723-c7fc-4d6f-90f5-6f69e7a3630b" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 615.473197] env[59620]: DEBUG nova.network.neutron [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 615.529156] env[59620]: DEBUG nova.network.neutron [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 615.538379] env[59620]: DEBUG oslo_concurrency.lockutils [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Acquiring lock "4a43bc91-94d5-46b4-8e29-e8a02d98249f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 615.538379] env[59620]: DEBUG oslo_concurrency.lockutils [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Lock "4a43bc91-94d5-46b4-8e29-e8a02d98249f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 615.554496] env[59620]: DEBUG nova.compute.manager [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] Starting instance... {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 615.631697] env[59620]: DEBUG oslo_concurrency.lockutils [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 615.632735] env[59620]: DEBUG oslo_concurrency.lockutils [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 615.635894] env[59620]: INFO nova.compute.claims [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 615.849912] env[59620]: DEBUG nova.network.neutron [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 615.854721] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae080528-187d-4eb4-bf48-d2fa0266b621 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 615.862395] env[59620]: DEBUG oslo_concurrency.lockutils [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Releasing lock "refresh_cache-89af0723-c7fc-4d6f-90f5-6f69e7a3630b" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 615.863251] env[59620]: DEBUG nova.compute.manager [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] Start destroying the instance on the hypervisor. {{(pid=59620) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 615.863583] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] Destroying instance {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 615.864422] env[59620]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-984641c5-908c-41b3-be10-15897fb78f8d {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 615.875583] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-737daa3e-5d51-47e7-9b7c-ec21418346ee {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 615.884486] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c334aff-7183-4dab-99fb-c8d0315119e1 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 615.945324] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d6450a0c-d27f-41b9-8a61-9e21d2d92f15 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 615.948677] env[59620]: WARNING nova.virt.vmwareapi.vmops [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 89af0723-c7fc-4d6f-90f5-6f69e7a3630b could not be found.
[ 615.949100] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] Instance destroyed {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 615.949770] env[59620]: INFO nova.compute.manager [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] Took 0.09 seconds to destroy the instance on the hypervisor.
[ 615.949770] env[59620]: DEBUG oslo.service.loopingcall [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59620) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 615.950129] env[59620]: DEBUG nova.compute.manager [-] [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 615.950302] env[59620]: DEBUG nova.network.neutron [-] [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 615.960144] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5aa4e7f-25b3-4584-a516-bcdfbde98d2c {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 615.986722] env[59620]: DEBUG nova.compute.provider_tree [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 615.992559] env[59620]: DEBUG nova.network.neutron [-] [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 616.001372] env[59620]: DEBUG nova.scheduler.client.report [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 616.013769] env[59620]: DEBUG oslo_concurrency.lockutils [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.382s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 616.014281] env[59620]: DEBUG nova.compute.manager [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] Start building networks asynchronously for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 616.018763] env[59620]: DEBUG nova.network.neutron [-] [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 616.030996] env[59620]: INFO nova.compute.manager [-] [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] Took 0.08 seconds to deallocate network for instance.
[ 616.034608] env[59620]: DEBUG nova.compute.claims [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] Aborting claim: {{(pid=59620) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 616.034608] env[59620]: DEBUG oslo_concurrency.lockutils [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 616.034608] env[59620]: DEBUG oslo_concurrency.lockutils [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 616.075234] env[59620]: DEBUG nova.compute.utils [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Using /dev/sd instead of None {{(pid=59620) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 616.076383] env[59620]: DEBUG nova.compute.manager [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] Allocating IP information in the background. {{(pid=59620) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 616.076687] env[59620]: DEBUG nova.network.neutron [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] allocate_for_instance() {{(pid=59620) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 616.089911] env[59620]: DEBUG nova.compute.manager [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] Start building block device mappings for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 616.191162] env[59620]: DEBUG nova.compute.manager [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] Start spawning the instance on the hypervisor. {{(pid=59620) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 616.219927] env[59620]: DEBUG nova.virt.hardware [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T20:10:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T20:10:40Z,direct_url=,disk_format='vmdk',id=2efa4364-ba59-4de9-978f-169a769ee710,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f77c77bd878b4e17895863a4a7a8191f',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T20:10:41Z,virtual_size=,visibility=), allow threads: False {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 616.220715] env[59620]: DEBUG nova.virt.hardware [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Flavor limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 616.220715] env[59620]: DEBUG nova.virt.hardware [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Image limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 616.220715] env[59620]: DEBUG nova.virt.hardware [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Flavor pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 616.220715] env[59620]: DEBUG nova.virt.hardware [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Image pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 616.220973] env[59620]: DEBUG nova.virt.hardware [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 616.220973] env[59620]: DEBUG nova.virt.hardware [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 616.221141] env[59620]: DEBUG nova.virt.hardware [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 616.221297] env[59620]: DEBUG nova.virt.hardware [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Got 1 possible topologies {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 616.224910] env[59620]: DEBUG nova.virt.hardware [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 616.224910] env[59620]: DEBUG nova.virt.hardware [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 616.224910] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b5cd3abf-6849-4a7e-9123-560850cc5d8b {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 616.230605] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6bbd74fa-8216-4e4f-bc23-eecef9a697f1 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 616.252454] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb01d773-2964-4886-ad01-07d0cf0fade9 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 616.259649] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b1c06a0-fe61-428a-8d54-21a6caeb7f95 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 616.291868] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71cae62e-e195-49b0-8449-fbdc7d011894 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 616.299589] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b51d8160-002f-4d1b-a2e3-558583bf6a44 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 616.315163] env[59620]: DEBUG nova.compute.provider_tree [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 616.323826] env[59620]: DEBUG nova.scheduler.client.report [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 616.352548] env[59620]: DEBUG oslo_concurrency.lockutils [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.318s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 616.353211] env[59620]: ERROR nova.compute.manager [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 30aab2fb-c3eb-4490-85a0-503a72da63d2, please check neutron logs for more information.
[ 616.353211] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] Traceback (most recent call last):
[ 616.353211] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 616.353211] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b]     self.driver.spawn(context, instance, image_meta,
[ 616.353211] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 616.353211] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 616.353211] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 616.353211] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b]     vm_ref = self.build_virtual_machine(instance,
[ 616.353211] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 616.353211] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b]     vif_infos = vmwarevif.get_vif_info(self._session,
[ 616.353211] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b]   File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 616.353540] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b]     for vif in network_info:
[ 616.353540] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b]   File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 616.353540] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b]     return self._sync_wrapper(fn, *args, **kwargs)
[ 616.353540] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b]   File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 616.353540] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b]     self.wait()
[ 616.353540] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b]   File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 616.353540] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b]     self[:] = self._gt.wait()
[ 616.353540] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 616.353540] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b]     return self._exit_event.wait()
[ 616.353540] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b]   File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 616.353540] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b]     result = hub.switch()
[ 616.353540] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b]   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 616.353540] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b]     return self.greenlet.switch()
[ 616.353540] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 616.353917] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b]     result = function(*args, **kwargs)
[ 616.353917] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b]   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 616.353917] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b]     return func(*args, **kwargs)
[ 616.353917] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b]   File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 616.353917] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b]     raise e
[ 616.353917] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b]   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 616.353917] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b]     nwinfo = self.network_api.allocate_for_instance(
[ 616.353917] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b]   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 616.353917] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b]     created_port_ids = self._update_ports_for_instance(
[ 616.353917] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b]   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 616.353917] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b]     with excutils.save_and_reraise_exception():
[ 616.353917] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 616.353917] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b]     self.force_reraise()
[ 616.354281] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 616.354281] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b]     raise self.value
[ 616.354281] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b]   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 616.354281] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b]     updated_port = self._update_port(
[ 616.354281] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b]   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 616.354281] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b]     _ensure_no_port_binding_failure(port)
[ 616.354281] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b]   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 616.354281] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b]     raise exception.PortBindingFailed(port_id=port['id'])
[ 616.354281] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] nova.exception.PortBindingFailed: Binding failed for port 30aab2fb-c3eb-4490-85a0-503a72da63d2, please check neutron logs for more information.
[ 616.354281] env[59620]: ERROR nova.compute.manager [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b]
[ 616.354281] env[59620]: DEBUG nova.compute.utils [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] Binding failed for port 30aab2fb-c3eb-4490-85a0-503a72da63d2, please check neutron logs for more information. {{(pid=59620) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 616.356301] env[59620]: DEBUG nova.compute.manager [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] Build of instance 89af0723-c7fc-4d6f-90f5-6f69e7a3630b was re-scheduled: Binding failed for port 30aab2fb-c3eb-4490-85a0-503a72da63d2, please check neutron logs for more information. {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 616.356496] env[59620]: DEBUG nova.compute.manager [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] Unplugging VIFs for instance {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 616.357013] env[59620]: DEBUG oslo_concurrency.lockutils [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Acquiring lock "refresh_cache-89af0723-c7fc-4d6f-90f5-6f69e7a3630b" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 616.357176] env[59620]: DEBUG oslo_concurrency.lockutils [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Acquired lock "refresh_cache-89af0723-c7fc-4d6f-90f5-6f69e7a3630b" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 616.357552] env[59620]: DEBUG nova.network.neutron [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 616.375972] env[59620]: DEBUG nova.policy [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '590bc22d21164771b6472358c1c3bfad', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '359089d39b2f4c95960a6768f9b4d1e6', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59620) authorize /opt/stack/nova/nova/policy.py:203}}
[ 616.385419] env[59620]: DEBUG nova.network.neutron [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 05155891-6002-4ac0-8386-62e8db523152] Successfully created port: 511cb5ac-4803-49ba-bedd-e40113b843bd {{(pid=59620) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 616.419969] env[59620]: DEBUG nova.network.neutron [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 616.955745] env[59620]: DEBUG nova.network.neutron [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 616.970190] env[59620]: DEBUG oslo_concurrency.lockutils [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Releasing lock "refresh_cache-89af0723-c7fc-4d6f-90f5-6f69e7a3630b" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 616.970328] env[59620]: DEBUG nova.compute.manager [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 616.970500] env[59620]: DEBUG nova.compute.manager [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 616.970668] env[59620]: DEBUG nova.network.neutron [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 617.064263] env[59620]: DEBUG nova.network.neutron [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 617.073680] env[59620]: DEBUG nova.network.neutron [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 617.086530] env[59620]: INFO nova.compute.manager [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] [instance: 89af0723-c7fc-4d6f-90f5-6f69e7a3630b] Took 0.12 seconds to deallocate network for instance.
[ 617.202248] env[59620]: INFO nova.scheduler.client.report [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Deleted allocations for instance 89af0723-c7fc-4d6f-90f5-6f69e7a3630b
[ 617.236335] env[59620]: DEBUG oslo_concurrency.lockutils [None req-d2a22dc1-98c4-4ab3-93b3-5754572c7da3 tempest-ImagesNegativeTestJSON-2067695245 tempest-ImagesNegativeTestJSON-2067695245-project-member] Lock "89af0723-c7fc-4d6f-90f5-6f69e7a3630b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 8.843s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 617.310051] env[59620]: DEBUG oslo_concurrency.lockutils [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Acquiring lock "07eb4258-4513-45f4-9789-0b362028abd7" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 617.310051] env[59620]: DEBUG oslo_concurrency.lockutils [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Lock "07eb4258-4513-45f4-9789-0b362028abd7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 617.330449] env[59620]: DEBUG nova.compute.manager [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 07eb4258-4513-45f4-9789-0b362028abd7] Starting instance...
{{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 617.406174] env[59620]: DEBUG oslo_concurrency.lockutils [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 617.406489] env[59620]: DEBUG oslo_concurrency.lockutils [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 617.408400] env[59620]: INFO nova.compute.claims [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 07eb4258-4513-45f4-9789-0b362028abd7] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 617.613037] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6451e2a0-d96b-4211-8908-517e4c74d3a2 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 617.625737] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b81ccf01-ec94-48e2-9116-0380b2b7d937 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 617.658550] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d704e7d1-0d5b-4807-8cfa-7c656e8c9911 {{(pid=59620) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 617.666828] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d96ade62-7302-4dcb-a2dd-14d2bb791e36 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 617.682501] env[59620]: DEBUG nova.compute.provider_tree [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 617.700590] env[59620]: DEBUG nova.scheduler.client.report [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 617.722857] env[59620]: DEBUG oslo_concurrency.lockutils [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.316s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 617.723388] env[59620]: DEBUG nova.compute.manager [None req-eaacd95f-bc94-485e-862b-064fab6eda6a 
tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 07eb4258-4513-45f4-9789-0b362028abd7] Start building networks asynchronously for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 617.764975] env[59620]: DEBUG nova.compute.utils [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Using /dev/sd instead of None {{(pid=59620) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 617.766586] env[59620]: DEBUG nova.compute.manager [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 07eb4258-4513-45f4-9789-0b362028abd7] Allocating IP information in the background. {{(pid=59620) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 617.766756] env[59620]: DEBUG nova.network.neutron [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 07eb4258-4513-45f4-9789-0b362028abd7] allocate_for_instance() {{(pid=59620) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 617.781061] env[59620]: DEBUG nova.compute.manager [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 07eb4258-4513-45f4-9789-0b362028abd7] Start building block device mappings for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 617.871847] env[59620]: DEBUG nova.compute.manager [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 07eb4258-4513-45f4-9789-0b362028abd7] Start spawning the instance on the hypervisor. 
{{(pid=59620) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 617.899877] env[59620]: DEBUG nova.virt.hardware [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T20:10:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T20:10:40Z,direct_url=,disk_format='vmdk',id=2efa4364-ba59-4de9-978f-169a769ee710,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f77c77bd878b4e17895863a4a7a8191f',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T20:10:41Z,virtual_size=,visibility=), allow threads: False {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 617.900136] env[59620]: DEBUG nova.virt.hardware [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Flavor limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 617.900291] env[59620]: DEBUG nova.virt.hardware [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Image limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 617.900466] env[59620]: DEBUG nova.virt.hardware [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Flavor pref 0:0:0 {{(pid=59620) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 617.900605] env[59620]: DEBUG nova.virt.hardware [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Image pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 617.900745] env[59620]: DEBUG nova.virt.hardware [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 617.900944] env[59620]: DEBUG nova.virt.hardware [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 617.901108] env[59620]: DEBUG nova.virt.hardware [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 617.901640] env[59620]: DEBUG nova.virt.hardware [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Got 1 possible topologies {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 617.901712] env[59620]: DEBUG nova.virt.hardware [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Possible topologies 
[VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 617.901889] env[59620]: DEBUG nova.virt.hardware [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 617.902753] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-52df2f7d-e06f-4046-8217-85d83fa9618a {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 617.913157] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-15a84de7-bfbe-4899-b638-f8d3b5f15c9f {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 617.980915] env[59620]: DEBUG oslo_service.periodic_task [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59620) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 617.980915] env[59620]: DEBUG oslo_service.periodic_task [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59620) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 617.981194] env[59620]: DEBUG nova.compute.manager [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Starting heal instance info cache {{(pid=59620) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 617.981229] env[59620]: DEBUG nova.compute.manager [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Rebuilding the list of instances to heal {{(pid=59620) 
_heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 617.998323] env[59620]: DEBUG nova.compute.manager [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] Skipping network cache update for instance because it is Building. {{(pid=59620) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 617.998323] env[59620]: DEBUG nova.compute.manager [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] Skipping network cache update for instance because it is Building. {{(pid=59620) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 617.998323] env[59620]: DEBUG nova.compute.manager [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] Skipping network cache update for instance because it is Building. {{(pid=59620) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 617.998446] env[59620]: DEBUG nova.compute.manager [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] [instance: 05155891-6002-4ac0-8386-62e8db523152] Skipping network cache update for instance because it is Building. {{(pid=59620) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 617.999010] env[59620]: DEBUG nova.compute.manager [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] Skipping network cache update for instance because it is Building. {{(pid=59620) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 617.999010] env[59620]: DEBUG nova.compute.manager [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] [instance: 07eb4258-4513-45f4-9789-0b362028abd7] Skipping network cache update for instance because it is Building. 
{{(pid=59620) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 617.999010] env[59620]: DEBUG nova.compute.manager [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Didn't find any instances for network info cache update. {{(pid=59620) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 618.002527] env[59620]: DEBUG oslo_service.periodic_task [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59620) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 618.005359] env[59620]: DEBUG oslo_service.periodic_task [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59620) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 618.005359] env[59620]: DEBUG oslo_service.periodic_task [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59620) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 618.005359] env[59620]: DEBUG oslo_service.periodic_task [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59620) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 618.005359] env[59620]: DEBUG oslo_service.periodic_task [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59620) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 618.005359] env[59620]: DEBUG oslo_service.periodic_task [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59620) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 618.005359] env[59620]: DEBUG nova.compute.manager [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59620) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 618.005888] env[59620]: DEBUG oslo_service.periodic_task [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Running periodic task ComputeManager.update_available_resource {{(pid=59620) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 618.020581] env[59620]: DEBUG oslo_concurrency.lockutils [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 618.020816] env[59620]: DEBUG oslo_concurrency.lockutils [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 618.022023] env[59620]: DEBUG oslo_concurrency.lockutils [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 618.022023] env[59620]: DEBUG nova.compute.resource_tracker [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59620) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 618.022471] env[59620]: DEBUG oslo_vmware.service [-] 
Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3fa11f2-d83d-4462-9a92-36909554f646 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 618.034313] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-314fd51a-52c5-4d1a-8451-58118aefb76b {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 618.054757] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-22befbe2-fae5-4130-88a8-42b4f8f6a467 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 618.060325] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ed9fb4c-527b-4a9c-874c-10cb1a50e675 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 618.091605] env[59620]: DEBUG nova.compute.resource_tracker [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181492MB free_disk=136GB free_vcpus=48 pci_devices=None {{(pid=59620) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 618.091770] env[59620]: DEBUG oslo_concurrency.lockutils [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 618.091961] env[59620]: DEBUG oslo_concurrency.lockutils [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59620) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 618.156354] env[59620]: DEBUG nova.compute.resource_tracker [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Instance 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59620) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 618.156538] env[59620]: DEBUG nova.compute.resource_tracker [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Instance 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59620) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 618.156718] env[59620]: DEBUG nova.compute.resource_tracker [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Instance 757b0e86-7d50-46c8-b69a-7e729d925cb1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59620) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 618.156843] env[59620]: DEBUG nova.compute.resource_tracker [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Instance 05155891-6002-4ac0-8386-62e8db523152 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59620) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 618.156969] env[59620]: DEBUG nova.compute.resource_tracker [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Instance 4a43bc91-94d5-46b4-8e29-e8a02d98249f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59620) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 618.157104] env[59620]: DEBUG nova.compute.resource_tracker [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Instance 07eb4258-4513-45f4-9789-0b362028abd7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59620) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 618.157288] env[59620]: DEBUG nova.compute.resource_tracker [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Total usable vcpus: 48, total allocated vcpus: 6 {{(pid=59620) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 618.157584] env[59620]: DEBUG nova.compute.resource_tracker [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1280MB phys_disk=200GB used_disk=6GB total_vcpus=48 used_vcpus=6 pci_stats=[] {{(pid=59620) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 618.185134] env[59620]: DEBUG nova.policy [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9ca548ffb6fe4cd9a91d5a743fd851cd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1c65b725b7474926832d3ffd92af67db', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59620) authorize /opt/stack/nova/nova/policy.py:203}} [ 618.301436] env[59620]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1fa9d44-a802-464e-accd-f3f637178c2c {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 618.310990] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9acd66b9-8579-4f75-9e03-1018e6767468 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 618.342145] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-996bf562-6aa8-446d-9568-3975140dd45e {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 618.350012] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4ff7abc-e776-4bca-9f47-051d3db07f42 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 618.365638] env[59620]: DEBUG nova.compute.provider_tree [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 618.375212] env[59620]: DEBUG nova.scheduler.client.report [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 618.390303] env[59620]: DEBUG 
nova.compute.resource_tracker [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59620) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 618.390303] env[59620]: DEBUG oslo_concurrency.lockutils [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.298s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 618.489615] env[59620]: DEBUG nova.network.neutron [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] Successfully created port: b336f827-0ec0-43eb-9a05-24c3fb7bd880 {{(pid=59620) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 620.253252] env[59620]: DEBUG nova.network.neutron [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 07eb4258-4513-45f4-9789-0b362028abd7] Successfully created port: 7289eb8f-c182-468e-ac2a-4fd18c65208d {{(pid=59620) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 620.683148] env[59620]: ERROR nova.compute.manager [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 438a9f07-b733-41d5-a82a-a560eeadf95c, please check neutron logs for more information. 
[ 620.683148] env[59620]: ERROR nova.compute.manager Traceback (most recent call last): [ 620.683148] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 620.683148] env[59620]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 620.683148] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 620.683148] env[59620]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 620.683148] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 620.683148] env[59620]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 620.683148] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 620.683148] env[59620]: ERROR nova.compute.manager self.force_reraise() [ 620.683148] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 620.683148] env[59620]: ERROR nova.compute.manager raise self.value [ 620.683148] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 620.683148] env[59620]: ERROR nova.compute.manager updated_port = self._update_port( [ 620.683148] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 620.683148] env[59620]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 620.683746] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 620.683746] env[59620]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 620.683746] env[59620]: ERROR 
nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 438a9f07-b733-41d5-a82a-a560eeadf95c, please check neutron logs for more information. [ 620.683746] env[59620]: ERROR nova.compute.manager [ 620.683746] env[59620]: Traceback (most recent call last): [ 620.683746] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 620.683746] env[59620]: listener.cb(fileno) [ 620.683746] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 620.683746] env[59620]: result = function(*args, **kwargs) [ 620.683746] env[59620]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 620.683746] env[59620]: return func(*args, **kwargs) [ 620.683746] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 620.683746] env[59620]: raise e [ 620.683746] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 620.683746] env[59620]: nwinfo = self.network_api.allocate_for_instance( [ 620.683746] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 620.683746] env[59620]: created_port_ids = self._update_ports_for_instance( [ 620.683746] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 620.683746] env[59620]: with excutils.save_and_reraise_exception(): [ 620.683746] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 620.683746] env[59620]: self.force_reraise() [ 620.683746] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 620.683746] env[59620]: raise self.value [ 620.683746] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 620.683746] env[59620]: updated_port = self._update_port( [ 620.683746] 
env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 620.683746] env[59620]: _ensure_no_port_binding_failure(port) [ 620.683746] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 620.683746] env[59620]: raise exception.PortBindingFailed(port_id=port['id']) [ 620.684525] env[59620]: nova.exception.PortBindingFailed: Binding failed for port 438a9f07-b733-41d5-a82a-a560eeadf95c, please check neutron logs for more information. [ 620.684525] env[59620]: Removing descriptor: 14 [ 620.684525] env[59620]: ERROR nova.compute.manager [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 438a9f07-b733-41d5-a82a-a560eeadf95c, please check neutron logs for more information. [ 620.684525] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] Traceback (most recent call last): [ 620.684525] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 620.684525] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] yield resources [ 620.684525] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 620.684525] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] self.driver.spawn(context, instance, image_meta, [ 620.684525] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 620.684525] env[59620]: ERROR nova.compute.manager 
[instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] self._vmops.spawn(context, instance, image_meta, injected_files, [ 620.684525] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 620.684525] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] vm_ref = self.build_virtual_machine(instance, [ 620.684937] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 620.684937] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] vif_infos = vmwarevif.get_vif_info(self._session, [ 620.684937] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 620.684937] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] for vif in network_info: [ 620.684937] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 620.684937] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] return self._sync_wrapper(fn, *args, **kwargs) [ 620.684937] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 620.684937] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] self.wait() [ 620.684937] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 620.684937] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] self[:] = self._gt.wait() [ 620.684937] 
env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 620.684937] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] return self._exit_event.wait() [ 620.684937] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 620.684937] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] result = hub.switch() [ 620.685335] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 620.685335] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] return self.greenlet.switch() [ 620.685335] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 620.685335] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] result = function(*args, **kwargs) [ 620.685335] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 620.685335] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] return func(*args, **kwargs) [ 620.685335] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 620.685335] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] raise e [ 620.685335] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] File 
"/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 620.685335] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] nwinfo = self.network_api.allocate_for_instance( [ 620.685335] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 620.685335] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] created_port_ids = self._update_ports_for_instance( [ 620.685335] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 620.686594] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] with excutils.save_and_reraise_exception(): [ 620.686594] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 620.686594] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] self.force_reraise() [ 620.686594] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 620.686594] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] raise self.value [ 620.686594] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 620.686594] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] updated_port = self._update_port( [ 620.686594] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] File 
"/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 620.686594] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] _ensure_no_port_binding_failure(port) [ 620.686594] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 620.686594] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] raise exception.PortBindingFailed(port_id=port['id']) [ 620.686594] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] nova.exception.PortBindingFailed: Binding failed for port 438a9f07-b733-41d5-a82a-a560eeadf95c, please check neutron logs for more information. [ 620.686594] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] [ 620.686973] env[59620]: INFO nova.compute.manager [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] Terminating instance [ 620.686973] env[59620]: DEBUG oslo_concurrency.lockutils [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Acquiring lock "refresh_cache-80421e87-c5bb-4eae-acd0-fa2ce12d8bbe" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 620.687077] env[59620]: DEBUG oslo_concurrency.lockutils [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Acquired lock "refresh_cache-80421e87-c5bb-4eae-acd0-fa2ce12d8bbe" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 620.687884] env[59620]: DEBUG nova.network.neutron [None 
req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 620.806112] env[59620]: DEBUG nova.network.neutron [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 621.271923] env[59620]: DEBUG nova.compute.manager [req-e59ac89f-8a38-4eb5-a46e-bc7bf6da3b6d req-a4378c49-e57d-4253-b86e-37f594185518 service nova] [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] Received event network-changed-438a9f07-b733-41d5-a82a-a560eeadf95c {{(pid=59620) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 621.272109] env[59620]: DEBUG nova.compute.manager [req-e59ac89f-8a38-4eb5-a46e-bc7bf6da3b6d req-a4378c49-e57d-4253-b86e-37f594185518 service nova] [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] Refreshing instance network info cache due to event network-changed-438a9f07-b733-41d5-a82a-a560eeadf95c. 
{{(pid=59620) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 621.272188] env[59620]: DEBUG oslo_concurrency.lockutils [req-e59ac89f-8a38-4eb5-a46e-bc7bf6da3b6d req-a4378c49-e57d-4253-b86e-37f594185518 service nova] Acquiring lock "refresh_cache-80421e87-c5bb-4eae-acd0-fa2ce12d8bbe" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 621.489419] env[59620]: DEBUG nova.network.neutron [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 621.504509] env[59620]: DEBUG oslo_concurrency.lockutils [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Releasing lock "refresh_cache-80421e87-c5bb-4eae-acd0-fa2ce12d8bbe" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 621.504960] env[59620]: DEBUG nova.compute.manager [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] Start destroying the instance on the hypervisor. 
{{(pid=59620) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 621.505084] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] Destroying instance {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 621.505389] env[59620]: DEBUG oslo_concurrency.lockutils [req-e59ac89f-8a38-4eb5-a46e-bc7bf6da3b6d req-a4378c49-e57d-4253-b86e-37f594185518 service nova] Acquired lock "refresh_cache-80421e87-c5bb-4eae-acd0-fa2ce12d8bbe" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 621.505551] env[59620]: DEBUG nova.network.neutron [req-e59ac89f-8a38-4eb5-a46e-bc7bf6da3b6d req-a4378c49-e57d-4253-b86e-37f594185518 service nova] [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] Refreshing network info cache for port 438a9f07-b733-41d5-a82a-a560eeadf95c {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 621.506667] env[59620]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-c7434008-f5b2-4cdc-87bd-f42013c77dae {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 621.520168] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2fdd595a-a182-4187-aecb-55ffeb56c453 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 621.549713] env[59620]: WARNING nova.virt.vmwareapi.vmops [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe could not be 
found. [ 621.550372] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] Instance destroyed {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 621.550710] env[59620]: INFO nova.compute.manager [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] Took 0.05 seconds to destroy the instance on the hypervisor. [ 621.550961] env[59620]: DEBUG oslo.service.loopingcall [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59620) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 621.551764] env[59620]: DEBUG nova.compute.manager [-] [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 621.551764] env[59620]: DEBUG nova.network.neutron [-] [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 621.598467] env[59620]: DEBUG nova.network.neutron [req-e59ac89f-8a38-4eb5-a46e-bc7bf6da3b6d req-a4378c49-e57d-4253-b86e-37f594185518 service nova] [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 621.862147] env[59620]: DEBUG nova.network.neutron [-] [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 621.870149] env[59620]: DEBUG nova.network.neutron [-] [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 621.884611] env[59620]: INFO nova.compute.manager [-] [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] Took 0.33 seconds to deallocate network for instance. [ 621.887808] env[59620]: DEBUG nova.compute.claims [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] Aborting claim: {{(pid=59620) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 621.887999] env[59620]: DEBUG oslo_concurrency.lockutils [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 621.888245] env[59620]: DEBUG oslo_concurrency.lockutils [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 621.963175] env[59620]: ERROR nova.compute.manager [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 
b10348ce-ec95-49ff-bce7-29c566656dd9, please check neutron logs for more information. [ 621.963175] env[59620]: ERROR nova.compute.manager Traceback (most recent call last): [ 621.963175] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 621.963175] env[59620]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 621.963175] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 621.963175] env[59620]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 621.963175] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 621.963175] env[59620]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 621.963175] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 621.963175] env[59620]: ERROR nova.compute.manager self.force_reraise() [ 621.963175] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 621.963175] env[59620]: ERROR nova.compute.manager raise self.value [ 621.963175] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 621.963175] env[59620]: ERROR nova.compute.manager updated_port = self._update_port( [ 621.963175] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 621.963175] env[59620]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 621.963725] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 621.963725] env[59620]: ERROR nova.compute.manager raise 
exception.PortBindingFailed(port_id=port['id']) [ 621.963725] env[59620]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port b10348ce-ec95-49ff-bce7-29c566656dd9, please check neutron logs for more information. [ 621.963725] env[59620]: ERROR nova.compute.manager [ 621.963725] env[59620]: Traceback (most recent call last): [ 621.963725] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 621.963725] env[59620]: listener.cb(fileno) [ 621.963725] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 621.963725] env[59620]: result = function(*args, **kwargs) [ 621.963725] env[59620]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 621.963725] env[59620]: return func(*args, **kwargs) [ 621.963725] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 621.963725] env[59620]: raise e [ 621.963725] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 621.963725] env[59620]: nwinfo = self.network_api.allocate_for_instance( [ 621.963725] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 621.963725] env[59620]: created_port_ids = self._update_ports_for_instance( [ 621.963725] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 621.963725] env[59620]: with excutils.save_and_reraise_exception(): [ 621.963725] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 621.963725] env[59620]: self.force_reraise() [ 621.963725] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 621.963725] env[59620]: raise self.value [ 621.963725] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 
621.963725] env[59620]: updated_port = self._update_port( [ 621.963725] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 621.963725] env[59620]: _ensure_no_port_binding_failure(port) [ 621.963725] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 621.963725] env[59620]: raise exception.PortBindingFailed(port_id=port['id']) [ 621.964421] env[59620]: nova.exception.PortBindingFailed: Binding failed for port b10348ce-ec95-49ff-bce7-29c566656dd9, please check neutron logs for more information. [ 621.964421] env[59620]: Removing descriptor: 15 [ 621.964421] env[59620]: ERROR nova.compute.manager [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port b10348ce-ec95-49ff-bce7-29c566656dd9, please check neutron logs for more information. 
[ 621.964421] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] Traceback (most recent call last): [ 621.964421] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 621.964421] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] yield resources [ 621.964421] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 621.964421] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] self.driver.spawn(context, instance, image_meta, [ 621.964421] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 621.964421] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] self._vmops.spawn(context, instance, image_meta, injected_files, [ 621.964421] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 621.964421] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] vm_ref = self.build_virtual_machine(instance, [ 621.965017] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 621.965017] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] vif_infos = vmwarevif.get_vif_info(self._session, [ 621.965017] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 621.965017] env[59620]: ERROR 
nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] for vif in network_info: [ 621.965017] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 621.965017] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] return self._sync_wrapper(fn, *args, **kwargs) [ 621.965017] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 621.965017] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] self.wait() [ 621.965017] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 621.965017] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] self[:] = self._gt.wait() [ 621.965017] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 621.965017] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] return self._exit_event.wait() [ 621.965017] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 621.965017] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] result = hub.switch() [ 621.965438] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 621.965438] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] return self.greenlet.switch() [ 621.965438] env[59620]: ERROR 
nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 621.965438] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] result = function(*args, **kwargs) [ 621.965438] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 621.965438] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] return func(*args, **kwargs) [ 621.965438] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 621.965438] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] raise e [ 621.965438] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 621.965438] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] nwinfo = self.network_api.allocate_for_instance( [ 621.965438] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 621.965438] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] created_port_ids = self._update_ports_for_instance( [ 621.965438] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 621.965796] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] with excutils.save_and_reraise_exception(): [ 621.965796] env[59620]: ERROR nova.compute.manager [instance: 
757b0e86-7d50-46c8-b69a-7e729d925cb1] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 621.965796] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] self.force_reraise() [ 621.965796] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 621.965796] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] raise self.value [ 621.965796] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 621.965796] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] updated_port = self._update_port( [ 621.965796] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 621.965796] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] _ensure_no_port_binding_failure(port) [ 621.965796] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 621.965796] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] raise exception.PortBindingFailed(port_id=port['id']) [ 621.965796] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] nova.exception.PortBindingFailed: Binding failed for port b10348ce-ec95-49ff-bce7-29c566656dd9, please check neutron logs for more information. 
[ 621.965796] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] [ 621.966152] env[59620]: INFO nova.compute.manager [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] Terminating instance [ 621.968989] env[59620]: DEBUG oslo_concurrency.lockutils [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Acquiring lock "refresh_cache-757b0e86-7d50-46c8-b69a-7e729d925cb1" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 621.968989] env[59620]: DEBUG oslo_concurrency.lockutils [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Acquired lock "refresh_cache-757b0e86-7d50-46c8-b69a-7e729d925cb1" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 621.968989] env[59620]: DEBUG nova.network.neutron [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 622.064430] env[59620]: DEBUG nova.network.neutron [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 622.133161] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5dd557b7-c6db-46cb-a5b6-26016c9b1b3e {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 622.142450] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f9a6af51-8e0a-4d84-a1ae-c6600b780bdd {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 622.184907] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a9528924-c8cf-4704-a747-20c6c755466e {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 622.193287] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b30d232d-9c55-4e71-a51d-521029bfac96 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 622.212517] env[59620]: DEBUG nova.compute.provider_tree [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 622.221384] env[59620]: DEBUG nova.scheduler.client.report [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 
'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 622.237598] env[59620]: DEBUG oslo_concurrency.lockutils [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.349s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 622.238544] env[59620]: ERROR nova.compute.manager [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 438a9f07-b733-41d5-a82a-a560eeadf95c, please check neutron logs for more information. 
[ 622.238544] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] Traceback (most recent call last): [ 622.238544] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 622.238544] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] self.driver.spawn(context, instance, image_meta, [ 622.238544] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 622.238544] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] self._vmops.spawn(context, instance, image_meta, injected_files, [ 622.238544] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 622.238544] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] vm_ref = self.build_virtual_machine(instance, [ 622.238544] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 622.238544] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] vif_infos = vmwarevif.get_vif_info(self._session, [ 622.238544] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 622.238949] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] for vif in network_info: [ 622.238949] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 622.238949] env[59620]: ERROR 
nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] return self._sync_wrapper(fn, *args, **kwargs) [ 622.238949] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 622.238949] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] self.wait() [ 622.238949] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 622.238949] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] self[:] = self._gt.wait() [ 622.238949] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 622.238949] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] return self._exit_event.wait() [ 622.238949] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 622.238949] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] result = hub.switch() [ 622.238949] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 622.238949] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] return self.greenlet.switch() [ 622.238949] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 622.239983] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] result = function(*args, **kwargs) [ 
622.239983] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 622.239983] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] return func(*args, **kwargs) [ 622.239983] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 622.239983] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] raise e [ 622.239983] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 622.239983] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] nwinfo = self.network_api.allocate_for_instance( [ 622.239983] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 622.239983] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] created_port_ids = self._update_ports_for_instance( [ 622.239983] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 622.239983] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] with excutils.save_and_reraise_exception(): [ 622.239983] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 622.239983] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] self.force_reraise() [ 622.240360] env[59620]: ERROR nova.compute.manager [instance: 
80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 622.240360] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] raise self.value [ 622.240360] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 622.240360] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] updated_port = self._update_port( [ 622.240360] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 622.240360] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] _ensure_no_port_binding_failure(port) [ 622.240360] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 622.240360] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] raise exception.PortBindingFailed(port_id=port['id']) [ 622.240360] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] nova.exception.PortBindingFailed: Binding failed for port 438a9f07-b733-41d5-a82a-a560eeadf95c, please check neutron logs for more information. [ 622.240360] env[59620]: ERROR nova.compute.manager [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] [ 622.240360] env[59620]: DEBUG nova.compute.utils [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] Binding failed for port 438a9f07-b733-41d5-a82a-a560eeadf95c, please check neutron logs for more information. 
{{(pid=59620) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 622.241685] env[59620]: DEBUG nova.compute.manager [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] Build of instance 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe was re-scheduled: Binding failed for port 438a9f07-b733-41d5-a82a-a560eeadf95c, please check neutron logs for more information. {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 622.242128] env[59620]: DEBUG nova.compute.manager [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] Unplugging VIFs for instance {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 622.242381] env[59620]: DEBUG oslo_concurrency.lockutils [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Acquiring lock "refresh_cache-80421e87-c5bb-4eae-acd0-fa2ce12d8bbe" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 622.415379] env[59620]: DEBUG nova.network.neutron [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 622.426919] env[59620]: DEBUG oslo_concurrency.lockutils [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Releasing lock 
"refresh_cache-757b0e86-7d50-46c8-b69a-7e729d925cb1" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 622.427585] env[59620]: DEBUG nova.compute.manager [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] Start destroying the instance on the hypervisor. {{(pid=59620) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 622.427585] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] Destroying instance {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 622.428456] env[59620]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-abe84b9f-a4f5-4bfd-8c5f-5bc1c0e574df {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 622.442740] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a4d850f-9b0d-454a-9267-be7ca1e78e55 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 622.471262] env[59620]: WARNING nova.virt.vmwareapi.vmops [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 757b0e86-7d50-46c8-b69a-7e729d925cb1 could not be found. 
[ 622.471262] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] Instance destroyed {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 622.471262] env[59620]: INFO nova.compute.manager [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] Took 0.04 seconds to destroy the instance on the hypervisor. [ 622.471262] env[59620]: DEBUG oslo.service.loopingcall [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=59620) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 622.471262] env[59620]: DEBUG nova.compute.manager [-] [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 622.471708] env[59620]: DEBUG nova.network.neutron [-] [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 622.651338] env[59620]: DEBUG nova.network.neutron [req-e59ac89f-8a38-4eb5-a46e-bc7bf6da3b6d req-a4378c49-e57d-4253-b86e-37f594185518 service nova] [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 622.661262] env[59620]: DEBUG nova.network.neutron [-] [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 622.668897] env[59620]: DEBUG nova.network.neutron [-] [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 622.678171] env[59620]: DEBUG oslo_concurrency.lockutils [req-e59ac89f-8a38-4eb5-a46e-bc7bf6da3b6d req-a4378c49-e57d-4253-b86e-37f594185518 service nova] Releasing lock "refresh_cache-80421e87-c5bb-4eae-acd0-fa2ce12d8bbe" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 622.679095] env[59620]: DEBUG oslo_concurrency.lockutils [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Acquired lock "refresh_cache-80421e87-c5bb-4eae-acd0-fa2ce12d8bbe" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 622.679253] env[59620]: DEBUG nova.network.neutron [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 622.686450] env[59620]: INFO nova.compute.manager [-] [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] Took 0.22 seconds to deallocate network for instance. 
[ 622.697514] env[59620]: DEBUG nova.compute.claims [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] Aborting claim: {{(pid=59620) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 622.697701] env[59620]: DEBUG oslo_concurrency.lockutils [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 622.698650] env[59620]: DEBUG oslo_concurrency.lockutils [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 622.854678] env[59620]: DEBUG nova.network.neutron [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 622.866496] env[59620]: DEBUG oslo_concurrency.lockutils [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Acquiring lock "4fd28c4b-e5df-475b-bb3d-f163c9f5b436" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 622.866496] env[59620]: DEBUG oslo_concurrency.lockutils [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Lock "4fd28c4b-e5df-475b-bb3d-f163c9f5b436" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 622.875711] env[59620]: DEBUG nova.compute.manager [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] Starting instance... 
{{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 622.885189] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b442d1f8-7064-4dd9-95b5-6eae277fa2a5 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 622.899413] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8b1ae58-1d85-46bd-b9ba-d08e5bd90f43 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 622.935878] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1dcb9838-c012-4448-994a-8fae2d052583 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 622.946043] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6e9133a-d794-48b2-a6e8-7810907975ef {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 622.951773] env[59620]: DEBUG oslo_concurrency.lockutils [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 622.961693] env[59620]: DEBUG nova.compute.provider_tree [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 622.974716] env[59620]: DEBUG 
nova.scheduler.client.report [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 622.993570] env[59620]: DEBUG oslo_concurrency.lockutils [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.295s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 622.994502] env[59620]: ERROR nova.compute.manager [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port b10348ce-ec95-49ff-bce7-29c566656dd9, please check neutron logs for more information. 
[ 622.994502] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] Traceback (most recent call last): [ 622.994502] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 622.994502] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] self.driver.spawn(context, instance, image_meta, [ 622.994502] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 622.994502] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] self._vmops.spawn(context, instance, image_meta, injected_files, [ 622.994502] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 622.994502] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] vm_ref = self.build_virtual_machine(instance, [ 622.994502] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 622.994502] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] vif_infos = vmwarevif.get_vif_info(self._session, [ 622.994502] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 622.994865] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] for vif in network_info: [ 622.994865] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 622.994865] env[59620]: ERROR 
nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] return self._sync_wrapper(fn, *args, **kwargs) [ 622.994865] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 622.994865] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] self.wait() [ 622.994865] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 622.994865] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] self[:] = self._gt.wait() [ 622.994865] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 622.994865] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] return self._exit_event.wait() [ 622.994865] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 622.994865] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] result = hub.switch() [ 622.994865] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 622.994865] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] return self.greenlet.switch() [ 622.994865] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 622.995287] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] result = function(*args, **kwargs) [ 
622.995287] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 622.995287] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] return func(*args, **kwargs) [ 622.995287] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 622.995287] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] raise e [ 622.995287] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 622.995287] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] nwinfo = self.network_api.allocate_for_instance( [ 622.995287] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 622.995287] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] created_port_ids = self._update_ports_for_instance( [ 622.995287] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 622.995287] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] with excutils.save_and_reraise_exception(): [ 622.995287] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 622.995287] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] self.force_reraise() [ 622.995620] env[59620]: ERROR nova.compute.manager [instance: 
757b0e86-7d50-46c8-b69a-7e729d925cb1] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 622.995620] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] raise self.value [ 622.995620] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 622.995620] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] updated_port = self._update_port( [ 622.995620] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 622.995620] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] _ensure_no_port_binding_failure(port) [ 622.995620] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 622.995620] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] raise exception.PortBindingFailed(port_id=port['id']) [ 622.995620] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] nova.exception.PortBindingFailed: Binding failed for port b10348ce-ec95-49ff-bce7-29c566656dd9, please check neutron logs for more information. [ 622.995620] env[59620]: ERROR nova.compute.manager [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] [ 622.995620] env[59620]: DEBUG nova.compute.utils [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] Binding failed for port b10348ce-ec95-49ff-bce7-29c566656dd9, please check neutron logs for more information. 
{{(pid=59620) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 622.996088] env[59620]: DEBUG nova.compute.manager [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] Build of instance 757b0e86-7d50-46c8-b69a-7e729d925cb1 was re-scheduled: Binding failed for port b10348ce-ec95-49ff-bce7-29c566656dd9, please check neutron logs for more information. {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 622.996505] env[59620]: DEBUG nova.compute.manager [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] Unplugging VIFs for instance {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 622.996735] env[59620]: DEBUG oslo_concurrency.lockutils [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Acquiring lock "refresh_cache-757b0e86-7d50-46c8-b69a-7e729d925cb1" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 622.996879] env[59620]: DEBUG oslo_concurrency.lockutils [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Acquired lock "refresh_cache-757b0e86-7d50-46c8-b69a-7e729d925cb1" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 622.997040] env[59620]: DEBUG nova.network.neutron [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] Building 
network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 623.001019] env[59620]: DEBUG oslo_concurrency.lockutils [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.047s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 623.001019] env[59620]: INFO nova.compute.claims [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 623.159606] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4e2862f1-2562-45cd-beb0-413500f5bc12 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 623.169849] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-36959dba-8f2b-44a3-b8b6-29361fb60434 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 623.177418] env[59620]: DEBUG nova.network.neutron [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 623.206724] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-46cfeff1-bdc5-4cb4-9272-2758cf187227 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 623.215857] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-70645a15-7c79-42f1-8f37-d0a76552dec7 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 623.233658] env[59620]: DEBUG nova.compute.provider_tree [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 623.268873] env[59620]: DEBUG nova.scheduler.client.report [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 623.298797] env[59620]: DEBUG oslo_concurrency.lockutils [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Lock 
"compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.300s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 623.299682] env[59620]: DEBUG nova.compute.manager [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] Start building networks asynchronously for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 623.346698] env[59620]: DEBUG nova.compute.utils [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Using /dev/sd instead of None {{(pid=59620) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 623.348588] env[59620]: DEBUG nova.compute.manager [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] Allocating IP information in the background. {{(pid=59620) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 623.349119] env[59620]: DEBUG nova.network.neutron [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] allocate_for_instance() {{(pid=59620) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 623.362083] env[59620]: DEBUG nova.compute.manager [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] Start building block device mappings for instance. 
{{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 623.444311] env[59620]: DEBUG nova.compute.manager [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] Start spawning the instance on the hypervisor. {{(pid=59620) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 623.471549] env[59620]: DEBUG nova.virt.hardware [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T20:10:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T20:10:40Z,direct_url=,disk_format='vmdk',id=2efa4364-ba59-4de9-978f-169a769ee710,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f77c77bd878b4e17895863a4a7a8191f',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T20:10:41Z,virtual_size=,visibility=), allow threads: False {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 623.471781] env[59620]: DEBUG nova.virt.hardware [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Flavor limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 623.471912] env[59620]: DEBUG nova.virt.hardware [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 
tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Image limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 623.473149] env[59620]: DEBUG nova.virt.hardware [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Flavor pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 623.473149] env[59620]: DEBUG nova.virt.hardware [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Image pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 623.473149] env[59620]: DEBUG nova.virt.hardware [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 623.473149] env[59620]: DEBUG nova.virt.hardware [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 623.473149] env[59620]: DEBUG nova.virt.hardware [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59620) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 623.473382] env[59620]: DEBUG nova.virt.hardware [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Got 1 possible topologies {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 623.473382] env[59620]: DEBUG nova.virt.hardware [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 623.473382] env[59620]: DEBUG nova.virt.hardware [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 623.474112] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0e04241-95bb-4f22-bb4d-fe4b76f0d345 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 623.482713] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b0bc9258-88b7-48f5-8c7a-70d24c528c55 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 623.579187] env[59620]: DEBUG nova.network.neutron [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info 
/opt/stack/nova/nova/network/neutron.py:116}} [ 623.593461] env[59620]: DEBUG oslo_concurrency.lockutils [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Releasing lock "refresh_cache-757b0e86-7d50-46c8-b69a-7e729d925cb1" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 623.593681] env[59620]: DEBUG nova.compute.manager [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 623.593837] env[59620]: DEBUG nova.compute.manager [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 623.593990] env[59620]: DEBUG nova.network.neutron [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 623.688141] env[59620]: DEBUG nova.network.neutron [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 623.697771] env[59620]: DEBUG nova.policy [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7edc0db7734849eebeb3768cf13f3408', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bd2bc2d1b262440f90066cdcb1bfc630', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59620) authorize /opt/stack/nova/nova/policy.py:203}} [ 623.704927] env[59620]: DEBUG nova.network.neutron [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 623.716237] env[59620]: INFO nova.compute.manager [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] [instance: 757b0e86-7d50-46c8-b69a-7e729d925cb1] Took 0.12 seconds to deallocate network for instance. 
[ 623.740714] env[59620]: DEBUG nova.network.neutron [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 623.755342] env[59620]: DEBUG oslo_concurrency.lockutils [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Releasing lock "refresh_cache-80421e87-c5bb-4eae-acd0-fa2ce12d8bbe" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 623.755637] env[59620]: DEBUG nova.compute.manager [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 623.755820] env[59620]: DEBUG nova.compute.manager [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 623.756014] env[59620]: DEBUG nova.network.neutron [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 623.824944] env[59620]: DEBUG nova.network.neutron [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 623.831685] env[59620]: INFO nova.scheduler.client.report [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Deleted allocations for instance 757b0e86-7d50-46c8-b69a-7e729d925cb1 [ 623.837583] env[59620]: DEBUG nova.network.neutron [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 623.853728] env[59620]: INFO nova.compute.manager [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] [instance: 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe] Took 0.10 seconds to deallocate network for instance. 
[ 623.861279] env[59620]: DEBUG oslo_concurrency.lockutils [None req-f8664e5e-f990-473f-a69f-a47a380824c9 tempest-DeleteServersAdminTestJSON-365478894 tempest-DeleteServersAdminTestJSON-365478894-project-member] Lock "757b0e86-7d50-46c8-b69a-7e729d925cb1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 12.272s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 623.963296] env[59620]: INFO nova.scheduler.client.report [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Deleted allocations for instance 80421e87-c5bb-4eae-acd0-fa2ce12d8bbe [ 623.981760] env[59620]: DEBUG oslo_concurrency.lockutils [None req-51932cd2-5a2f-4ddc-aa98-e8743ead86b5 tempest-ImagesOneServerTestJSON-836620002 tempest-ImagesOneServerTestJSON-836620002-project-member] Lock "80421e87-c5bb-4eae-acd0-fa2ce12d8bbe" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 13.506s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 624.171803] env[59620]: ERROR nova.compute.manager [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 511cb5ac-4803-49ba-bedd-e40113b843bd, please check neutron logs for more information. 
[ 624.171803] env[59620]: ERROR nova.compute.manager Traceback (most recent call last): [ 624.171803] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 624.171803] env[59620]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 624.171803] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 624.171803] env[59620]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 624.171803] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 624.171803] env[59620]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 624.171803] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 624.171803] env[59620]: ERROR nova.compute.manager self.force_reraise() [ 624.171803] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 624.171803] env[59620]: ERROR nova.compute.manager raise self.value [ 624.171803] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 624.171803] env[59620]: ERROR nova.compute.manager updated_port = self._update_port( [ 624.171803] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 624.171803] env[59620]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 624.172251] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 624.172251] env[59620]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 624.172251] env[59620]: ERROR 
nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 511cb5ac-4803-49ba-bedd-e40113b843bd, please check neutron logs for more information. [ 624.172251] env[59620]: ERROR nova.compute.manager [ 624.172251] env[59620]: Traceback (most recent call last): [ 624.172251] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 624.172251] env[59620]: listener.cb(fileno) [ 624.172251] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 624.172251] env[59620]: result = function(*args, **kwargs) [ 624.172251] env[59620]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 624.172251] env[59620]: return func(*args, **kwargs) [ 624.172251] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 624.172251] env[59620]: raise e [ 624.172251] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 624.172251] env[59620]: nwinfo = self.network_api.allocate_for_instance( [ 624.172251] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 624.172251] env[59620]: created_port_ids = self._update_ports_for_instance( [ 624.172251] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 624.172251] env[59620]: with excutils.save_and_reraise_exception(): [ 624.172251] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 624.172251] env[59620]: self.force_reraise() [ 624.172251] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 624.172251] env[59620]: raise self.value [ 624.172251] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 624.172251] env[59620]: updated_port = self._update_port( [ 624.172251] 
env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 624.172251] env[59620]: _ensure_no_port_binding_failure(port) [ 624.172251] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 624.172251] env[59620]: raise exception.PortBindingFailed(port_id=port['id']) [ 624.172985] env[59620]: nova.exception.PortBindingFailed: Binding failed for port 511cb5ac-4803-49ba-bedd-e40113b843bd, please check neutron logs for more information. [ 624.172985] env[59620]: Removing descriptor: 16 [ 624.172985] env[59620]: ERROR nova.compute.manager [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 05155891-6002-4ac0-8386-62e8db523152] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 511cb5ac-4803-49ba-bedd-e40113b843bd, please check neutron logs for more information. [ 624.172985] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152] Traceback (most recent call last): [ 624.172985] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 624.172985] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152] yield resources [ 624.172985] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 624.172985] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152] self.driver.spawn(context, instance, image_meta, [ 624.172985] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 624.172985] env[59620]: ERROR nova.compute.manager [instance: 
05155891-6002-4ac0-8386-62e8db523152]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 624.172985] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 624.172985] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]     vm_ref = self.build_virtual_machine(instance,
[ 624.173297] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 624.173297] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]     vif_infos = vmwarevif.get_vif_info(self._session,
[ 624.173297] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]   File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 624.173297] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]     for vif in network_info:
[ 624.173297] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]   File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 624.173297] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]     return self._sync_wrapper(fn, *args, **kwargs)
[ 624.173297] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]   File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 624.173297] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]     self.wait()
[ 624.173297] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]   File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 624.173297] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]     self[:] = self._gt.wait()
[ 624.173297] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 624.173297] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]     return self._exit_event.wait()
[ 624.173297] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]   File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 624.173297] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]     result = hub.switch()
[ 624.173659] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 624.173659] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]     return self.greenlet.switch()
[ 624.173659] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 624.173659] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]     result = function(*args, **kwargs)
[ 624.173659] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 624.173659] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]     return func(*args, **kwargs)
[ 624.173659] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]   File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 624.173659] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]     raise e
[ 624.173659] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 624.173659] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]     nwinfo = self.network_api.allocate_for_instance(
[ 624.173659] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 624.173659] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]     created_port_ids = self._update_ports_for_instance(
[ 624.173659] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 624.173985] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]     with excutils.save_and_reraise_exception():
[ 624.173985] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 624.173985] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]     self.force_reraise()
[ 624.173985] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 624.173985] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]     raise self.value
[ 624.173985] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 624.173985] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]     updated_port = self._update_port(
[ 624.173985] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 624.173985] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]     _ensure_no_port_binding_failure(port)
[ 624.173985] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 624.173985] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]     raise exception.PortBindingFailed(port_id=port['id'])
[ 624.173985] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152] nova.exception.PortBindingFailed: Binding failed for port 511cb5ac-4803-49ba-bedd-e40113b843bd, please check neutron logs for more information.
[ 624.173985] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]
[ 624.174364] env[59620]: INFO nova.compute.manager [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 05155891-6002-4ac0-8386-62e8db523152] Terminating instance
[ 624.177435] env[59620]: DEBUG oslo_concurrency.lockutils [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Acquiring lock "refresh_cache-05155891-6002-4ac0-8386-62e8db523152" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 624.177435] env[59620]: DEBUG oslo_concurrency.lockutils [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Acquired lock "refresh_cache-05155891-6002-4ac0-8386-62e8db523152" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 624.177577] env[59620]: DEBUG nova.network.neutron [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 05155891-6002-4ac0-8386-62e8db523152] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 624.235081] env[59620]: DEBUG nova.network.neutron [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 05155891-6002-4ac0-8386-62e8db523152] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 625.005103] env[59620]: DEBUG nova.network.neutron [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 05155891-6002-4ac0-8386-62e8db523152] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 625.019468] env[59620]: DEBUG oslo_concurrency.lockutils [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Releasing lock "refresh_cache-05155891-6002-4ac0-8386-62e8db523152" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 625.019872] env[59620]: DEBUG nova.compute.manager [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 05155891-6002-4ac0-8386-62e8db523152] Start destroying the instance on the hypervisor. {{(pid=59620) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 625.020073] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 05155891-6002-4ac0-8386-62e8db523152] Destroying instance {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 625.021055] env[59620]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-8f5e695d-8f05-47d7-80b8-f5b17129d8c8 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 625.041514] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-143cb191-f0c8-47f3-806d-c2ad053c4944 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 625.063609] env[59620]: WARNING nova.virt.vmwareapi.vmops [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 05155891-6002-4ac0-8386-62e8db523152] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 05155891-6002-4ac0-8386-62e8db523152 could not be found.
[ 625.063823] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 05155891-6002-4ac0-8386-62e8db523152] Instance destroyed {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 625.063993] env[59620]: INFO nova.compute.manager [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 05155891-6002-4ac0-8386-62e8db523152] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 625.065980] env[59620]: DEBUG oslo.service.loopingcall [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59620) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 625.066395] env[59620]: DEBUG nova.compute.manager [-] [instance: 05155891-6002-4ac0-8386-62e8db523152] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 625.066395] env[59620]: DEBUG nova.network.neutron [-] [instance: 05155891-6002-4ac0-8386-62e8db523152] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 625.191194] env[59620]: DEBUG nova.network.neutron [-] [instance: 05155891-6002-4ac0-8386-62e8db523152] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 625.204606] env[59620]: DEBUG nova.network.neutron [-] [instance: 05155891-6002-4ac0-8386-62e8db523152] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 625.217619] env[59620]: INFO nova.compute.manager [-] [instance: 05155891-6002-4ac0-8386-62e8db523152] Took 0.15 seconds to deallocate network for instance.
[ 625.221428] env[59620]: DEBUG nova.compute.claims [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 05155891-6002-4ac0-8386-62e8db523152] Aborting claim: {{(pid=59620) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 625.221577] env[59620]: DEBUG oslo_concurrency.lockutils [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 625.221789] env[59620]: DEBUG oslo_concurrency.lockutils [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 625.319413] env[59620]: DEBUG oslo_concurrency.lockutils [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Acquiring lock "9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 625.319750] env[59620]: DEBUG oslo_concurrency.lockutils [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Lock "9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 625.333112] env[59620]: DEBUG nova.compute.manager [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] Starting instance... {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 625.394446] env[59620]: DEBUG oslo_concurrency.lockutils [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 625.400464] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5160f6df-4986-4e33-a9d8-a44bf430a52e {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 625.408464] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4609b3d-8d4e-4043-8a28-bfe509dd1f26 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 625.439521] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a862a12-0efc-42d0-b90f-ade78909146e {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 625.448060] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-017f6cca-bfaa-4f6f-a896-b07f452fef7a {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 625.462412] env[59620]: DEBUG nova.compute.provider_tree [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 625.470516] env[59620]: DEBUG nova.scheduler.client.report [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 625.490354] env[59620]: DEBUG oslo_concurrency.lockutils [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.268s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 625.490987] env[59620]: ERROR nova.compute.manager [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 05155891-6002-4ac0-8386-62e8db523152] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 511cb5ac-4803-49ba-bedd-e40113b843bd, please check neutron logs for more information.
[ 625.490987] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152] Traceback (most recent call last):
[ 625.490987] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 625.490987] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]     self.driver.spawn(context, instance, image_meta,
[ 625.490987] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 625.490987] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 625.490987] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 625.490987] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]     vm_ref = self.build_virtual_machine(instance,
[ 625.490987] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 625.490987] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]     vif_infos = vmwarevif.get_vif_info(self._session,
[ 625.490987] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]   File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 625.491321] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]     for vif in network_info:
[ 625.491321] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]   File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 625.491321] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]     return self._sync_wrapper(fn, *args, **kwargs)
[ 625.491321] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]   File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 625.491321] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]     self.wait()
[ 625.491321] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]   File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 625.491321] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]     self[:] = self._gt.wait()
[ 625.491321] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 625.491321] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]     return self._exit_event.wait()
[ 625.491321] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]   File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 625.491321] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]     result = hub.switch()
[ 625.491321] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 625.491321] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]     return self.greenlet.switch()
[ 625.491321] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 625.491673] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]     result = function(*args, **kwargs)
[ 625.491673] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 625.491673] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]     return func(*args, **kwargs)
[ 625.491673] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]   File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 625.491673] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]     raise e
[ 625.491673] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 625.491673] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]     nwinfo = self.network_api.allocate_for_instance(
[ 625.491673] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 625.491673] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]     created_port_ids = self._update_ports_for_instance(
[ 625.491673] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 625.491673] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]     with excutils.save_and_reraise_exception():
[ 625.491673] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 625.491673] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]     self.force_reraise()
[ 625.492160] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 625.492160] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]     raise self.value
[ 625.492160] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 625.492160] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]     updated_port = self._update_port(
[ 625.492160] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 625.492160] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]     _ensure_no_port_binding_failure(port)
[ 625.492160] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 625.492160] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]     raise exception.PortBindingFailed(port_id=port['id'])
[ 625.492160] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152] nova.exception.PortBindingFailed: Binding failed for port 511cb5ac-4803-49ba-bedd-e40113b843bd, please check neutron logs for more information.
[ 625.492160] env[59620]: ERROR nova.compute.manager [instance: 05155891-6002-4ac0-8386-62e8db523152]
[ 625.492160] env[59620]: DEBUG nova.compute.utils [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 05155891-6002-4ac0-8386-62e8db523152] Binding failed for port 511cb5ac-4803-49ba-bedd-e40113b843bd, please check neutron logs for more information. {{(pid=59620) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 625.493456] env[59620]: DEBUG oslo_concurrency.lockutils [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.099s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 625.495282] env[59620]: INFO nova.compute.claims [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 625.498458] env[59620]: DEBUG nova.compute.manager [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 05155891-6002-4ac0-8386-62e8db523152] Build of instance 05155891-6002-4ac0-8386-62e8db523152 was re-scheduled: Binding failed for port 511cb5ac-4803-49ba-bedd-e40113b843bd, please check neutron logs for more information. {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 625.499018] env[59620]: DEBUG nova.compute.manager [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 05155891-6002-4ac0-8386-62e8db523152] Unplugging VIFs for instance {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 625.499273] env[59620]: DEBUG oslo_concurrency.lockutils [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Acquiring lock "refresh_cache-05155891-6002-4ac0-8386-62e8db523152" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 625.499444] env[59620]: DEBUG oslo_concurrency.lockutils [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Acquired lock "refresh_cache-05155891-6002-4ac0-8386-62e8db523152" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 625.499635] env[59620]: DEBUG nova.network.neutron [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 05155891-6002-4ac0-8386-62e8db523152] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 625.669568] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3facff45-429b-4e0c-8d41-ce52490d3410 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 625.678935] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b232a0a-730c-4614-a40a-d218db8af33c {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 625.725246] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4914cb4f-9959-4085-9c00-f72a0bd9521e {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 625.734623] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81f9a5a6-2a52-44d4-a7cb-d5ef944933d4 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 625.751437] env[59620]: DEBUG nova.compute.provider_tree [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 625.761421] env[59620]: DEBUG nova.scheduler.client.report [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 625.781828] env[59620]: DEBUG oslo_concurrency.lockutils [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.288s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 625.782252] env[59620]: DEBUG nova.compute.manager [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] Start building networks asynchronously for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 625.817994] env[59620]: DEBUG nova.network.neutron [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 05155891-6002-4ac0-8386-62e8db523152] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 625.824460] env[59620]: DEBUG nova.compute.utils [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Using /dev/sd instead of None {{(pid=59620) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 625.828077] env[59620]: DEBUG nova.compute.manager [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] Allocating IP information in the background. {{(pid=59620) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 625.828077] env[59620]: DEBUG nova.network.neutron [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] allocate_for_instance() {{(pid=59620) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 625.840831] env[59620]: DEBUG nova.compute.manager [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] Start building block device mappings for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 625.933715] env[59620]: ERROR nova.compute.manager [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port b336f827-0ec0-43eb-9a05-24c3fb7bd880, please check neutron logs for more information.
[ 625.933715] env[59620]: ERROR nova.compute.manager Traceback (most recent call last):
[ 625.933715] env[59620]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 625.933715] env[59620]: ERROR nova.compute.manager     nwinfo = self.network_api.allocate_for_instance(
[ 625.933715] env[59620]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 625.933715] env[59620]: ERROR nova.compute.manager     created_port_ids = self._update_ports_for_instance(
[ 625.933715] env[59620]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 625.933715] env[59620]: ERROR nova.compute.manager     with excutils.save_and_reraise_exception():
[ 625.933715] env[59620]: ERROR nova.compute.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 625.933715] env[59620]: ERROR nova.compute.manager     self.force_reraise()
[ 625.933715] env[59620]: ERROR nova.compute.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 625.933715] env[59620]: ERROR nova.compute.manager     raise self.value
[ 625.933715] env[59620]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 625.933715] env[59620]: ERROR nova.compute.manager     updated_port = self._update_port(
[ 625.933715] env[59620]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 625.933715] env[59620]: ERROR nova.compute.manager     _ensure_no_port_binding_failure(port)
[ 625.934234] env[59620]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 625.934234] env[59620]: ERROR nova.compute.manager     raise exception.PortBindingFailed(port_id=port['id'])
[ 625.934234] env[59620]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port b336f827-0ec0-43eb-9a05-24c3fb7bd880, please check neutron logs for more information.
[ 625.934234] env[59620]: ERROR nova.compute.manager
[ 625.934234] env[59620]: Traceback (most recent call last):
[ 625.934234] env[59620]:   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait
[ 625.934234] env[59620]:     listener.cb(fileno)
[ 625.934234] env[59620]:   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 625.934234] env[59620]:     result = function(*args, **kwargs)
[ 625.934234] env[59620]:   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 625.934234] env[59620]:     return func(*args, **kwargs)
[ 625.934234] env[59620]:   File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 625.934234] env[59620]:     raise e
[ 625.934234] env[59620]:   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 625.934234] env[59620]:     nwinfo = self.network_api.allocate_for_instance(
[ 625.934234] env[59620]:   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 625.934234] env[59620]:     created_port_ids = self._update_ports_for_instance(
[ 625.934234] env[59620]:   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 625.934234] env[59620]:     with excutils.save_and_reraise_exception():
[ 625.934234] env[59620]:   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 625.934234] env[59620]:     self.force_reraise()
[ 625.934234] env[59620]:   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 625.934234] env[59620]:     raise self.value
[ 625.934234] env[59620]:   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 625.934234] env[59620]:     updated_port = self._update_port(
[ 625.934234] env[59620]:   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 625.934234] env[59620]:     _ensure_no_port_binding_failure(port)
[ 625.934234] env[59620]:   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 625.934234] env[59620]:     raise exception.PortBindingFailed(port_id=port['id'])
[ 625.935069] env[59620]: nova.exception.PortBindingFailed: Binding failed for port b336f827-0ec0-43eb-9a05-24c3fb7bd880, please check neutron logs for more information.
[ 625.935069] env[59620]: Removing descriptor: 11
[ 625.935069] env[59620]: ERROR nova.compute.manager [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port b336f827-0ec0-43eb-9a05-24c3fb7bd880, please check neutron logs for more information.
[ 625.935069] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] Traceback (most recent call last):
[ 625.935069] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f]   File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 625.935069] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f]     yield resources
[ 625.935069] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 625.935069] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f]     self.driver.spawn(context, instance, image_meta,
[ 625.935069] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 625.935069] env[59620]: ERROR nova.compute.manager
[instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 625.935069] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 625.935069] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] vm_ref = self.build_virtual_machine(instance, [ 625.935394] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 625.935394] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] vif_infos = vmwarevif.get_vif_info(self._session, [ 625.935394] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 625.935394] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] for vif in network_info: [ 625.935394] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 625.935394] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] return self._sync_wrapper(fn, *args, **kwargs) [ 625.935394] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 625.935394] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] self.wait() [ 625.935394] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 625.935394] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] self[:] = self._gt.wait() [ 625.935394] 
env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 625.935394] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] return self._exit_event.wait() [ 625.935394] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 625.935394] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] result = hub.switch() [ 625.935758] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 625.935758] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] return self.greenlet.switch() [ 625.935758] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 625.935758] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] result = function(*args, **kwargs) [ 625.935758] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 625.935758] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] return func(*args, **kwargs) [ 625.935758] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 625.935758] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] raise e [ 625.935758] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] File 
"/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 625.935758] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] nwinfo = self.network_api.allocate_for_instance( [ 625.935758] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 625.935758] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] created_port_ids = self._update_ports_for_instance( [ 625.935758] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 625.936200] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] with excutils.save_and_reraise_exception(): [ 625.936200] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 625.936200] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] self.force_reraise() [ 625.936200] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 625.936200] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] raise self.value [ 625.936200] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 625.936200] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] updated_port = self._update_port( [ 625.936200] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] File 
"/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 625.936200] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] _ensure_no_port_binding_failure(port) [ 625.936200] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 625.936200] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] raise exception.PortBindingFailed(port_id=port['id']) [ 625.936200] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] nova.exception.PortBindingFailed: Binding failed for port b336f827-0ec0-43eb-9a05-24c3fb7bd880, please check neutron logs for more information. [ 625.936200] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] [ 625.936556] env[59620]: INFO nova.compute.manager [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] Terminating instance [ 625.940020] env[59620]: DEBUG oslo_concurrency.lockutils [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Acquiring lock "refresh_cache-4a43bc91-94d5-46b4-8e29-e8a02d98249f" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 625.940020] env[59620]: DEBUG oslo_concurrency.lockutils [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Acquired lock "refresh_cache-4a43bc91-94d5-46b4-8e29-e8a02d98249f" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 625.940020] env[59620]: DEBUG nova.network.neutron [None 
req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 625.957857] env[59620]: DEBUG nova.compute.manager [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] Start spawning the instance on the hypervisor. {{(pid=59620) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 625.992662] env[59620]: DEBUG nova.virt.hardware [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T20:10:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T20:10:40Z,direct_url=,disk_format='vmdk',id=2efa4364-ba59-4de9-978f-169a769ee710,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f77c77bd878b4e17895863a4a7a8191f',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T20:10:41Z,virtual_size=,visibility=), allow threads: False {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 625.993227] env[59620]: DEBUG nova.virt.hardware [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 
tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Flavor limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 625.994709] env[59620]: DEBUG nova.virt.hardware [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Image limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 625.995083] env[59620]: DEBUG nova.virt.hardware [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Flavor pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 625.995357] env[59620]: DEBUG nova.virt.hardware [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Image pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 625.995913] env[59620]: DEBUG nova.virt.hardware [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 625.996703] env[59620]: DEBUG nova.virt.hardware [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 625.996969] env[59620]: DEBUG nova.virt.hardware [None 
req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 625.997360] env[59620]: DEBUG nova.virt.hardware [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Got 1 possible topologies {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 625.997641] env[59620]: DEBUG nova.virt.hardware [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 625.997930] env[59620]: DEBUG nova.virt.hardware [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 626.000934] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-280ad283-f495-4dd0-acbd-fdfc6f99a44d {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 626.015753] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-154dabd0-b9b3-4654-afd2-3b8e1163f950 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 626.044338] env[59620]: DEBUG nova.network.neutron [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b 
tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 626.304236] env[59620]: DEBUG nova.policy [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'efa9362057904c4eadee051c14b92935', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0584adefba3a4307aed8fd8fcca5267f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59620) authorize /opt/stack/nova/nova/policy.py:203}} [ 626.310342] env[59620]: DEBUG nova.network.neutron [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] Successfully created port: 5e08de89-08df-4236-82f9-1588491bdb78 {{(pid=59620) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 626.470833] env[59620]: DEBUG oslo_concurrency.lockutils [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Acquiring lock "b1a72905-ae94-42e3-8926-0f81cb502942" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 626.471276] env[59620]: DEBUG oslo_concurrency.lockutils [None 
req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Lock "b1a72905-ae94-42e3-8926-0f81cb502942" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 626.488268] env[59620]: DEBUG nova.compute.manager [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] [instance: b1a72905-ae94-42e3-8926-0f81cb502942] Starting instance... {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 626.564307] env[59620]: DEBUG oslo_concurrency.lockutils [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 626.564307] env[59620]: DEBUG oslo_concurrency.lockutils [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 626.566496] env[59620]: INFO nova.compute.claims [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] [instance: b1a72905-ae94-42e3-8926-0f81cb502942] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 626.671199] env[59620]: DEBUG 
nova.network.neutron [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 05155891-6002-4ac0-8386-62e8db523152] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 626.697358] env[59620]: DEBUG oslo_concurrency.lockutils [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Releasing lock "refresh_cache-05155891-6002-4ac0-8386-62e8db523152" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 626.697358] env[59620]: DEBUG nova.compute.manager [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 626.697535] env[59620]: DEBUG nova.compute.manager [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 05155891-6002-4ac0-8386-62e8db523152] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 626.697697] env[59620]: DEBUG nova.network.neutron [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 05155891-6002-4ac0-8386-62e8db523152] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 626.824849] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7251b3d7-8283-4857-9aa8-401611a1b0ab {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 626.829911] env[59620]: DEBUG nova.network.neutron [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 05155891-6002-4ac0-8386-62e8db523152] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 626.839400] env[59620]: DEBUG nova.network.neutron [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 05155891-6002-4ac0-8386-62e8db523152] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 626.841903] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a48b8c3-ec33-4565-b8b9-4a0e97f8e2fa {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 626.887258] env[59620]: INFO nova.compute.manager [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 05155891-6002-4ac0-8386-62e8db523152] Took 0.19 seconds to deallocate network for instance. 
[ 626.891032] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0df6c1bd-2b53-4f30-81f5-4e682f6f9ceb {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 626.899205] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-736b1e4b-af72-4048-9b4c-2e048ff13499 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 626.922765] env[59620]: DEBUG nova.compute.provider_tree [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 626.943296] env[59620]: DEBUG nova.scheduler.client.report [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 626.952520] env[59620]: DEBUG nova.network.neutron [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] Updating instance_info_cache with network_info: [] {{(pid=59620) 
update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 626.967474] env[59620]: DEBUG oslo_concurrency.lockutils [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Releasing lock "refresh_cache-4a43bc91-94d5-46b4-8e29-e8a02d98249f" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 626.967872] env[59620]: DEBUG nova.compute.manager [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] Start destroying the instance on the hypervisor. {{(pid=59620) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 626.968059] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] Destroying instance {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 626.972028] env[59620]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-fd542cfb-6714-4b7d-ae26-e45b94640bd3 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 626.973101] env[59620]: DEBUG oslo_concurrency.lockutils [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.409s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 626.973583] env[59620]: DEBUG nova.compute.manager [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de 
tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] [instance: b1a72905-ae94-42e3-8926-0f81cb502942] Start building networks asynchronously for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 626.983154] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a36042a9-9f9f-45f1-8f47-45002f7ee78a {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 627.013477] env[59620]: WARNING nova.virt.vmwareapi.vmops [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 4a43bc91-94d5-46b4-8e29-e8a02d98249f could not be found.
[ 627.013477] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] Instance destroyed {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 627.015041] env[59620]: INFO nova.compute.manager [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] Took 0.05 seconds to destroy the instance on the hypervisor.
[ 627.015041] env[59620]: DEBUG oslo.service.loopingcall [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59620) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 627.015041] env[59620]: DEBUG nova.compute.manager [-] [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 627.015041] env[59620]: DEBUG nova.network.neutron [-] [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 627.028928] env[59620]: DEBUG nova.compute.utils [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Using /dev/sd instead of None {{(pid=59620) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 627.033064] env[59620]: DEBUG nova.compute.manager [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] [instance: b1a72905-ae94-42e3-8926-0f81cb502942] Allocating IP information in the background. {{(pid=59620) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 627.033064] env[59620]: DEBUG nova.network.neutron [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] [instance: b1a72905-ae94-42e3-8926-0f81cb502942] allocate_for_instance() {{(pid=59620) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 627.040749] env[59620]: INFO nova.scheduler.client.report [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Deleted allocations for instance 05155891-6002-4ac0-8386-62e8db523152
[ 627.050368] env[59620]: DEBUG nova.compute.manager [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] [instance: b1a72905-ae94-42e3-8926-0f81cb502942] Start building block device mappings for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 627.058279] env[59620]: DEBUG oslo_concurrency.lockutils [None req-5791d303-4116-492a-a4ea-80ee8f90a627 tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Lock "05155891-6002-4ac0-8386-62e8db523152" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 13.295s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 627.059140] env[59620]: DEBUG nova.network.neutron [-] [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 627.074633] env[59620]: DEBUG nova.network.neutron [-] [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 627.126869] env[59620]: INFO nova.compute.manager [-] [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] Took 0.11 seconds to deallocate network for instance.
[ 627.130550] env[59620]: DEBUG nova.compute.claims [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] Aborting claim: {{(pid=59620) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 627.130550] env[59620]: DEBUG oslo_concurrency.lockutils [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 627.130550] env[59620]: DEBUG oslo_concurrency.lockutils [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 627.161984] env[59620]: DEBUG nova.compute.manager [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] [instance: b1a72905-ae94-42e3-8926-0f81cb502942] Start spawning the instance on the hypervisor. {{(pid=59620) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 627.197043] env[59620]: DEBUG nova.virt.hardware [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T20:10:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T20:10:40Z,direct_url=,disk_format='vmdk',id=2efa4364-ba59-4de9-978f-169a769ee710,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f77c77bd878b4e17895863a4a7a8191f',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T20:10:41Z,virtual_size=,visibility=), allow threads: False {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 627.201149] env[59620]: DEBUG nova.virt.hardware [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Flavor limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 627.201149] env[59620]: DEBUG nova.virt.hardware [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Image limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 627.201149] env[59620]: DEBUG nova.virt.hardware [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Flavor pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 627.201149] env[59620]: DEBUG nova.virt.hardware [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Image pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 627.201149] env[59620]: DEBUG nova.virt.hardware [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 627.201970] env[59620]: DEBUG nova.virt.hardware [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 627.201970] env[59620]: DEBUG nova.virt.hardware [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 627.201970] env[59620]: DEBUG nova.virt.hardware [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Got 1 possible topologies {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 627.202622] env[59620]: DEBUG nova.virt.hardware [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 627.202622] env[59620]: DEBUG nova.virt.hardware [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 627.205361] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4014f954-c129-4f24-993d-4bd71692ccf0 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 627.222157] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce9979b4-04b6-4ce7-90d5-47c479c36728 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 627.334773] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c056002-41c7-468e-bb14-e6a44b896609 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 627.344826] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5bb513a2-b038-42c9-88fc-294bbb32dd34 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 627.389709] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7cb034ac-113c-40d5-a4bc-bb528d13d952 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 627.398277] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9bef9e8f-bd3d-4dca-bd98-0a52eccf36a9 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 627.415389] env[59620]: DEBUG nova.compute.provider_tree [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 627.417549] env[59620]: ERROR nova.compute.manager [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 7289eb8f-c182-468e-ac2a-4fd18c65208d, please check neutron logs for more information.
[ 627.417549] env[59620]: ERROR nova.compute.manager Traceback (most recent call last):
[ 627.417549] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 627.417549] env[59620]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance(
[ 627.417549] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 627.417549] env[59620]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance(
[ 627.417549] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 627.417549] env[59620]: ERROR nova.compute.manager with excutils.save_and_reraise_exception():
[ 627.417549] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 627.417549] env[59620]: ERROR nova.compute.manager self.force_reraise()
[ 627.417549] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 627.417549] env[59620]: ERROR nova.compute.manager raise self.value
[ 627.417549] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 627.417549] env[59620]: ERROR nova.compute.manager updated_port = self._update_port(
[ 627.417549] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 627.417549] env[59620]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port)
[ 627.418090] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 627.418090] env[59620]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id'])
[ 627.418090] env[59620]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 7289eb8f-c182-468e-ac2a-4fd18c65208d, please check neutron logs for more information.
[ 627.418090] env[59620]: ERROR nova.compute.manager
[ 627.418090] env[59620]: Traceback (most recent call last):
[ 627.418090] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait
[ 627.418090] env[59620]: listener.cb(fileno)
[ 627.418090] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 627.418090] env[59620]: result = function(*args, **kwargs)
[ 627.418090] env[59620]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 627.418090] env[59620]: return func(*args, **kwargs)
[ 627.418090] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 627.418090] env[59620]: raise e
[ 627.418090] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 627.418090] env[59620]: nwinfo = self.network_api.allocate_for_instance(
[ 627.418090] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 627.418090] env[59620]: created_port_ids = self._update_ports_for_instance(
[ 627.418090] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 627.418090] env[59620]: with excutils.save_and_reraise_exception():
[ 627.418090] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 627.418090] env[59620]: self.force_reraise()
[ 627.418090] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 627.418090] env[59620]: raise self.value
[ 627.418090] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 627.418090] env[59620]: updated_port = self._update_port(
[ 627.418090] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 627.418090] env[59620]: _ensure_no_port_binding_failure(port)
[ 627.418090] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 627.418090] env[59620]: raise exception.PortBindingFailed(port_id=port['id'])
[ 627.419429] env[59620]: nova.exception.PortBindingFailed: Binding failed for port 7289eb8f-c182-468e-ac2a-4fd18c65208d, please check neutron logs for more information.
[ 627.419429] env[59620]: Removing descriptor: 18
[ 627.419429] env[59620]: ERROR nova.compute.manager [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 07eb4258-4513-45f4-9789-0b362028abd7] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 7289eb8f-c182-468e-ac2a-4fd18c65208d, please check neutron logs for more information.
[ 627.419429] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] Traceback (most recent call last):
[ 627.419429] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 627.419429] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] yield resources
[ 627.419429] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 627.419429] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] self.driver.spawn(context, instance, image_meta,
[ 627.419429] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 627.419429] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 627.419429] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 627.419429] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] vm_ref = self.build_virtual_machine(instance,
[ 627.419859] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 627.419859] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] vif_infos = vmwarevif.get_vif_info(self._session,
[ 627.419859] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 627.419859] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] for vif in network_info:
[ 627.419859] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 627.419859] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] return self._sync_wrapper(fn, *args, **kwargs)
[ 627.419859] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 627.419859] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] self.wait()
[ 627.419859] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 627.419859] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] self[:] = self._gt.wait()
[ 627.419859] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 627.419859] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] return self._exit_event.wait()
[ 627.419859] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 627.419859] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] result = hub.switch()
[ 627.423846] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 627.423846] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] return self.greenlet.switch()
[ 627.423846] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 627.423846] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] result = function(*args, **kwargs)
[ 627.423846] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 627.423846] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] return func(*args, **kwargs)
[ 627.423846] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 627.423846] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] raise e
[ 627.423846] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 627.423846] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] nwinfo = self.network_api.allocate_for_instance(
[ 627.423846] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 627.423846] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] created_port_ids = self._update_ports_for_instance(
[ 627.423846] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 627.424262] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] with excutils.save_and_reraise_exception():
[ 627.424262] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 627.424262] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] self.force_reraise()
[ 627.424262] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 627.424262] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] raise self.value
[ 627.424262] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 627.424262] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] updated_port = self._update_port(
[ 627.424262] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 627.424262] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] _ensure_no_port_binding_failure(port)
[ 627.424262] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 627.424262] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] raise exception.PortBindingFailed(port_id=port['id'])
[ 627.424262] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] nova.exception.PortBindingFailed: Binding failed for port 7289eb8f-c182-468e-ac2a-4fd18c65208d, please check neutron logs for more information.
[ 627.424262] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7]
[ 627.424628] env[59620]: INFO nova.compute.manager [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 07eb4258-4513-45f4-9789-0b362028abd7] Terminating instance
[ 627.424628] env[59620]: DEBUG oslo_concurrency.lockutils [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Acquiring lock "refresh_cache-07eb4258-4513-45f4-9789-0b362028abd7" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 627.424628] env[59620]: DEBUG oslo_concurrency.lockutils [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Acquired lock "refresh_cache-07eb4258-4513-45f4-9789-0b362028abd7" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 627.424628] env[59620]: DEBUG nova.network.neutron [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 07eb4258-4513-45f4-9789-0b362028abd7] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 627.428032] env[59620]: DEBUG nova.scheduler.client.report [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 627.455563] env[59620]: DEBUG oslo_concurrency.lockutils [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.325s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 627.456601] env[59620]: ERROR nova.compute.manager [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port b336f827-0ec0-43eb-9a05-24c3fb7bd880, please check neutron logs for more information.
[ 627.456601] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] Traceback (most recent call last):
[ 627.456601] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 627.456601] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] self.driver.spawn(context, instance, image_meta,
[ 627.456601] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 627.456601] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 627.456601] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 627.456601] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] vm_ref = self.build_virtual_machine(instance,
[ 627.456601] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 627.456601] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] vif_infos = vmwarevif.get_vif_info(self._session,
[ 627.456601] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 627.456957] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] for vif in network_info:
[ 627.456957] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 627.456957] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] return self._sync_wrapper(fn, *args, **kwargs)
[ 627.456957] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 627.456957] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] self.wait()
[ 627.456957] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 627.456957] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] self[:] = self._gt.wait()
[ 627.456957] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 627.456957] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] return self._exit_event.wait()
[ 627.456957] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 627.456957] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] result = hub.switch()
[ 627.456957] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 627.456957] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] return self.greenlet.switch()
[ 627.456957] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 627.457305] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] result = function(*args, **kwargs)
[ 627.457305] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 627.457305] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] return func(*args, **kwargs)
[ 627.457305] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 627.457305] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] raise e
[ 627.457305] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 627.457305] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] nwinfo = self.network_api.allocate_for_instance(
[ 627.457305] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 627.457305] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] created_port_ids = self._update_ports_for_instance(
[ 627.457305] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 627.457305] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] with excutils.save_and_reraise_exception():
[ 627.457305] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 627.457305] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] self.force_reraise()
[ 627.457613] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 627.457613] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] raise self.value
[ 627.457613] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 627.457613] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] updated_port = self._update_port(
[ 627.457613] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 627.457613] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] _ensure_no_port_binding_failure(port)
[ 627.457613] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 627.457613] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] raise exception.PortBindingFailed(port_id=port['id'])
[ 627.457613] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] nova.exception.PortBindingFailed: Binding failed for port b336f827-0ec0-43eb-9a05-24c3fb7bd880, please check neutron logs for more information.
[ 627.457613] env[59620]: ERROR nova.compute.manager [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f]
[ 627.457613] env[59620]: DEBUG nova.compute.utils [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] Binding failed for port b336f827-0ec0-43eb-9a05-24c3fb7bd880, please check neutron logs for more information. {{(pid=59620) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 627.460195] env[59620]: DEBUG nova.compute.manager [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] Build of instance 4a43bc91-94d5-46b4-8e29-e8a02d98249f was re-scheduled: Binding failed for port b336f827-0ec0-43eb-9a05-24c3fb7bd880, please check neutron logs for more information. {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 627.460639] env[59620]: DEBUG nova.compute.manager [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] Unplugging VIFs for instance {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 627.460850] env[59620]: DEBUG oslo_concurrency.lockutils [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Acquiring lock "refresh_cache-4a43bc91-94d5-46b4-8e29-e8a02d98249f" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 627.460986] env[59620]: DEBUG oslo_concurrency.lockutils [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Acquired lock "refresh_cache-4a43bc91-94d5-46b4-8e29-e8a02d98249f" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 627.461211] env[59620]: DEBUG nova.network.neutron [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] Building network info cache
for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 627.481660] env[59620]: DEBUG nova.network.neutron [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 07eb4258-4513-45f4-9789-0b362028abd7] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 627.522507] env[59620]: DEBUG nova.network.neutron [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 627.596845] env[59620]: DEBUG nova.policy [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b85992f2684e4986aa987d719b82db40', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd83638802c3948e7a158d4a31c0c904b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59620) authorize /opt/stack/nova/nova/policy.py:203}} [ 628.074906] env[59620]: DEBUG nova.network.neutron [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 07eb4258-4513-45f4-9789-0b362028abd7] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 628.085937] 
env[59620]: DEBUG oslo_concurrency.lockutils [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Releasing lock "refresh_cache-07eb4258-4513-45f4-9789-0b362028abd7" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 628.085937] env[59620]: DEBUG nova.compute.manager [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 07eb4258-4513-45f4-9789-0b362028abd7] Start destroying the instance on the hypervisor. {{(pid=59620) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 628.085937] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 07eb4258-4513-45f4-9789-0b362028abd7] Destroying instance {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 628.086128] env[59620]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-ae26fc33-9ba0-40eb-8279-38f97095897b {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 628.099276] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c263915-221f-4be4-b372-8ddf6491fd52 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 628.116860] env[59620]: DEBUG nova.network.neutron [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 628.133117] 
env[59620]: WARNING nova.virt.vmwareapi.vmops [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 07eb4258-4513-45f4-9789-0b362028abd7] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 07eb4258-4513-45f4-9789-0b362028abd7 could not be found. [ 628.133459] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 07eb4258-4513-45f4-9789-0b362028abd7] Instance destroyed {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 628.133752] env[59620]: INFO nova.compute.manager [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 07eb4258-4513-45f4-9789-0b362028abd7] Took 0.05 seconds to destroy the instance on the hypervisor. [ 628.133874] env[59620]: DEBUG oslo.service.loopingcall [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=59620) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 628.139673] env[59620]: DEBUG nova.compute.manager [-] [instance: 07eb4258-4513-45f4-9789-0b362028abd7] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 628.139673] env[59620]: DEBUG nova.network.neutron [-] [instance: 07eb4258-4513-45f4-9789-0b362028abd7] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 628.139860] env[59620]: DEBUG oslo_concurrency.lockutils [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Releasing lock "refresh_cache-4a43bc91-94d5-46b4-8e29-e8a02d98249f" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 628.140070] env[59620]: DEBUG nova.compute.manager [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 628.140662] env[59620]: DEBUG nova.compute.manager [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 628.140662] env[59620]: DEBUG nova.network.neutron [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 628.191254] env[59620]: DEBUG nova.network.neutron [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 628.208721] env[59620]: DEBUG nova.network.neutron [-] [instance: 07eb4258-4513-45f4-9789-0b362028abd7] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 628.211339] env[59620]: DEBUG nova.network.neutron [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 628.220380] env[59620]: INFO nova.compute.manager [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] [instance: 4a43bc91-94d5-46b4-8e29-e8a02d98249f] Took 0.08 seconds to deallocate network for instance. [ 628.222674] env[59620]: DEBUG nova.network.neutron [-] [instance: 07eb4258-4513-45f4-9789-0b362028abd7] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 628.239367] env[59620]: INFO nova.compute.manager [-] [instance: 07eb4258-4513-45f4-9789-0b362028abd7] Took 0.10 seconds to deallocate network for instance. 
[ 628.243988] env[59620]: DEBUG nova.compute.claims [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 07eb4258-4513-45f4-9789-0b362028abd7] Aborting claim: {{(pid=59620) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 628.244857] env[59620]: DEBUG oslo_concurrency.lockutils [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 628.245483] env[59620]: DEBUG oslo_concurrency.lockutils [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 628.363742] env[59620]: INFO nova.scheduler.client.report [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Deleted allocations for instance 4a43bc91-94d5-46b4-8e29-e8a02d98249f [ 628.394862] env[59620]: DEBUG oslo_concurrency.lockutils [None req-90ccd2ae-0940-43ea-8ea8-5bde695a566b tempest-ServersTestFqdnHostnames-506908644 tempest-ServersTestFqdnHostnames-506908644-project-member] Lock "4a43bc91-94d5-46b4-8e29-e8a02d98249f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 12.857s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 628.463956] env[59620]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a3e936e-1a41-4427-b214-cfa9a9ce4a62 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 628.472121] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-167c834e-0904-497b-be44-b7888d34f30b {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 628.502709] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b96d62b-066f-4122-a112-0f0ec331a8bc {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 628.510603] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e2cee18c-d967-4de4-a590-1f1a90a96bb4 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 628.525309] env[59620]: DEBUG nova.compute.provider_tree [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 628.536113] env[59620]: DEBUG nova.scheduler.client.report [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 
'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 628.554308] env[59620]: DEBUG oslo_concurrency.lockutils [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.309s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 628.556013] env[59620]: ERROR nova.compute.manager [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 07eb4258-4513-45f4-9789-0b362028abd7] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 7289eb8f-c182-468e-ac2a-4fd18c65208d, please check neutron logs for more information. [ 628.556013] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] Traceback (most recent call last): [ 628.556013] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 628.556013] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] self.driver.spawn(context, instance, image_meta, [ 628.556013] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 628.556013] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] self._vmops.spawn(context, instance, image_meta, injected_files, [ 628.556013] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 628.556013] env[59620]: ERROR 
nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] vm_ref = self.build_virtual_machine(instance, [ 628.556013] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 628.556013] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] vif_infos = vmwarevif.get_vif_info(self._session, [ 628.556013] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 628.556564] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] for vif in network_info: [ 628.556564] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 628.556564] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] return self._sync_wrapper(fn, *args, **kwargs) [ 628.556564] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 628.556564] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] self.wait() [ 628.556564] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 628.556564] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] self[:] = self._gt.wait() [ 628.556564] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 628.556564] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] return self._exit_event.wait() [ 
628.556564] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 628.556564] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] result = hub.switch() [ 628.556564] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 628.556564] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] return self.greenlet.switch() [ 628.556564] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 628.556934] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] result = function(*args, **kwargs) [ 628.556934] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 628.556934] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] return func(*args, **kwargs) [ 628.556934] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 628.556934] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] raise e [ 628.556934] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 628.556934] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] nwinfo = self.network_api.allocate_for_instance( [ 628.556934] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] 
File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 628.556934] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] created_port_ids = self._update_ports_for_instance( [ 628.556934] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 628.556934] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] with excutils.save_and_reraise_exception(): [ 628.556934] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 628.556934] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] self.force_reraise() [ 628.557283] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 628.557283] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] raise self.value [ 628.557283] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 628.557283] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] updated_port = self._update_port( [ 628.557283] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 628.557283] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] _ensure_no_port_binding_failure(port) [ 628.557283] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] File 
"/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 628.557283] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] raise exception.PortBindingFailed(port_id=port['id']) [ 628.557283] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] nova.exception.PortBindingFailed: Binding failed for port 7289eb8f-c182-468e-ac2a-4fd18c65208d, please check neutron logs for more information. [ 628.557283] env[59620]: ERROR nova.compute.manager [instance: 07eb4258-4513-45f4-9789-0b362028abd7] [ 628.557283] env[59620]: DEBUG nova.compute.utils [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 07eb4258-4513-45f4-9789-0b362028abd7] Binding failed for port 7289eb8f-c182-468e-ac2a-4fd18c65208d, please check neutron logs for more information. {{(pid=59620) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 628.557575] env[59620]: DEBUG nova.compute.manager [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 07eb4258-4513-45f4-9789-0b362028abd7] Build of instance 07eb4258-4513-45f4-9789-0b362028abd7 was re-scheduled: Binding failed for port 7289eb8f-c182-468e-ac2a-4fd18c65208d, please check neutron logs for more information. 
{{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 628.557784] env[59620]: DEBUG nova.compute.manager [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 07eb4258-4513-45f4-9789-0b362028abd7] Unplugging VIFs for instance {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 628.558200] env[59620]: DEBUG oslo_concurrency.lockutils [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Acquiring lock "refresh_cache-07eb4258-4513-45f4-9789-0b362028abd7" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 628.558200] env[59620]: DEBUG oslo_concurrency.lockutils [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Acquired lock "refresh_cache-07eb4258-4513-45f4-9789-0b362028abd7" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 628.558387] env[59620]: DEBUG nova.network.neutron [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 07eb4258-4513-45f4-9789-0b362028abd7] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 628.607336] env[59620]: DEBUG nova.network.neutron [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] Successfully created port: 9d54dbc2-c248-4bfc-a3d5-70d355561876 {{(pid=59620) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 628.614464] env[59620]: DEBUG 
nova.network.neutron [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 07eb4258-4513-45f4-9789-0b362028abd7] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 629.248958] env[59620]: DEBUG nova.network.neutron [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 07eb4258-4513-45f4-9789-0b362028abd7] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 629.257516] env[59620]: DEBUG oslo_concurrency.lockutils [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Releasing lock "refresh_cache-07eb4258-4513-45f4-9789-0b362028abd7" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 629.257741] env[59620]: DEBUG nova.compute.manager [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 629.258017] env[59620]: DEBUG nova.compute.manager [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 07eb4258-4513-45f4-9789-0b362028abd7] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 629.258090] env[59620]: DEBUG nova.network.neutron [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 07eb4258-4513-45f4-9789-0b362028abd7] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 629.319402] env[59620]: DEBUG nova.network.neutron [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 07eb4258-4513-45f4-9789-0b362028abd7] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 629.328058] env[59620]: DEBUG nova.network.neutron [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 07eb4258-4513-45f4-9789-0b362028abd7] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 629.336410] env[59620]: INFO nova.compute.manager [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] [instance: 07eb4258-4513-45f4-9789-0b362028abd7] Took 0.08 seconds to deallocate network for instance. 
[ 629.471147] env[59620]: INFO nova.scheduler.client.report [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Deleted allocations for instance 07eb4258-4513-45f4-9789-0b362028abd7 [ 629.507394] env[59620]: DEBUG oslo_concurrency.lockutils [None req-eaacd95f-bc94-485e-862b-064fab6eda6a tempest-ServersAdminTestJSON-1253349300 tempest-ServersAdminTestJSON-1253349300-project-member] Lock "07eb4258-4513-45f4-9789-0b362028abd7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 12.196s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 630.183105] env[59620]: DEBUG nova.network.neutron [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] [instance: b1a72905-ae94-42e3-8926-0f81cb502942] Successfully created port: 252581c2-e5f7-471e-bfe2-b357c144e919 {{(pid=59620) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 634.286474] env[59620]: DEBUG nova.compute.manager [req-39e97806-4061-4a8c-aece-87ee68f489d5 req-b68659f3-f692-4459-b087-96b012e91f69 service nova] [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] Received event network-changed-5e08de89-08df-4236-82f9-1588491bdb78 {{(pid=59620) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 634.286474] env[59620]: DEBUG nova.compute.manager [req-39e97806-4061-4a8c-aece-87ee68f489d5 req-b68659f3-f692-4459-b087-96b012e91f69 service nova] [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] Refreshing instance network info cache due to event network-changed-5e08de89-08df-4236-82f9-1588491bdb78. 
{{(pid=59620) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 634.286474] env[59620]: DEBUG oslo_concurrency.lockutils [req-39e97806-4061-4a8c-aece-87ee68f489d5 req-b68659f3-f692-4459-b087-96b012e91f69 service nova] Acquiring lock "refresh_cache-4fd28c4b-e5df-475b-bb3d-f163c9f5b436" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 634.286474] env[59620]: DEBUG oslo_concurrency.lockutils [req-39e97806-4061-4a8c-aece-87ee68f489d5 req-b68659f3-f692-4459-b087-96b012e91f69 service nova] Acquired lock "refresh_cache-4fd28c4b-e5df-475b-bb3d-f163c9f5b436" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 634.286474] env[59620]: DEBUG nova.network.neutron [req-39e97806-4061-4a8c-aece-87ee68f489d5 req-b68659f3-f692-4459-b087-96b012e91f69 service nova] [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] Refreshing network info cache for port 5e08de89-08df-4236-82f9-1588491bdb78 {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 634.327618] env[59620]: DEBUG nova.network.neutron [req-39e97806-4061-4a8c-aece-87ee68f489d5 req-b68659f3-f692-4459-b087-96b012e91f69 service nova] [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 634.435525] env[59620]: DEBUG nova.network.neutron [req-39e97806-4061-4a8c-aece-87ee68f489d5 req-b68659f3-f692-4459-b087-96b012e91f69 service nova] [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 634.446082] env[59620]: DEBUG oslo_concurrency.lockutils [req-39e97806-4061-4a8c-aece-87ee68f489d5 req-b68659f3-f692-4459-b087-96b012e91f69 service nova] Releasing lock "refresh_cache-4fd28c4b-e5df-475b-bb3d-f163c9f5b436" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 634.448682] env[59620]: ERROR nova.compute.manager [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 5e08de89-08df-4236-82f9-1588491bdb78, please check neutron logs for more information. 
[ 634.448682] env[59620]: ERROR nova.compute.manager Traceback (most recent call last): [ 634.448682] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 634.448682] env[59620]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 634.448682] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 634.448682] env[59620]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 634.448682] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 634.448682] env[59620]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 634.448682] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 634.448682] env[59620]: ERROR nova.compute.manager self.force_reraise() [ 634.448682] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 634.448682] env[59620]: ERROR nova.compute.manager raise self.value [ 634.448682] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 634.448682] env[59620]: ERROR nova.compute.manager updated_port = self._update_port( [ 634.448682] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 634.448682] env[59620]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 634.450090] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 634.450090] env[59620]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 634.450090] env[59620]: ERROR 
nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 5e08de89-08df-4236-82f9-1588491bdb78, please check neutron logs for more information. [ 634.450090] env[59620]: ERROR nova.compute.manager [ 634.450090] env[59620]: Traceback (most recent call last): [ 634.450090] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 634.450090] env[59620]: listener.cb(fileno) [ 634.450090] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 634.450090] env[59620]: result = function(*args, **kwargs) [ 634.450090] env[59620]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 634.450090] env[59620]: return func(*args, **kwargs) [ 634.450090] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 634.450090] env[59620]: raise e [ 634.450090] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 634.450090] env[59620]: nwinfo = self.network_api.allocate_for_instance( [ 634.450090] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 634.450090] env[59620]: created_port_ids = self._update_ports_for_instance( [ 634.450090] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 634.450090] env[59620]: with excutils.save_and_reraise_exception(): [ 634.450090] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 634.450090] env[59620]: self.force_reraise() [ 634.450090] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 634.450090] env[59620]: raise self.value [ 634.450090] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 634.450090] env[59620]: updated_port = self._update_port( [ 634.450090] 
env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 634.450090] env[59620]: _ensure_no_port_binding_failure(port) [ 634.450090] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 634.450090] env[59620]: raise exception.PortBindingFailed(port_id=port['id']) [ 634.451008] env[59620]: nova.exception.PortBindingFailed: Binding failed for port 5e08de89-08df-4236-82f9-1588491bdb78, please check neutron logs for more information. [ 634.451008] env[59620]: Removing descriptor: 19 [ 634.451008] env[59620]: ERROR nova.compute.manager [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 5e08de89-08df-4236-82f9-1588491bdb78, please check neutron logs for more information. 
[ 634.451008] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] Traceback (most recent call last): [ 634.451008] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 634.451008] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] yield resources [ 634.451008] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 634.451008] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] self.driver.spawn(context, instance, image_meta, [ 634.451008] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 634.451008] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] self._vmops.spawn(context, instance, image_meta, injected_files, [ 634.451008] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 634.451008] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] vm_ref = self.build_virtual_machine(instance, [ 634.451410] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 634.451410] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] vif_infos = vmwarevif.get_vif_info(self._session, [ 634.451410] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 634.451410] env[59620]: ERROR 
nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] for vif in network_info: [ 634.451410] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 634.451410] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] return self._sync_wrapper(fn, *args, **kwargs) [ 634.451410] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 634.451410] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] self.wait() [ 634.451410] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 634.451410] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] self[:] = self._gt.wait() [ 634.451410] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 634.451410] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] return self._exit_event.wait() [ 634.451410] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 634.451410] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] result = hub.switch() [ 634.451821] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 634.451821] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] return self.greenlet.switch() [ 634.451821] env[59620]: ERROR 
nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 634.451821] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] result = function(*args, **kwargs) [ 634.451821] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 634.451821] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] return func(*args, **kwargs) [ 634.451821] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 634.451821] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] raise e [ 634.451821] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 634.451821] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] nwinfo = self.network_api.allocate_for_instance( [ 634.451821] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 634.451821] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] created_port_ids = self._update_ports_for_instance( [ 634.451821] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 634.453466] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] with excutils.save_and_reraise_exception(): [ 634.453466] env[59620]: ERROR nova.compute.manager [instance: 
4fd28c4b-e5df-475b-bb3d-f163c9f5b436] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 634.453466] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] self.force_reraise() [ 634.453466] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 634.453466] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] raise self.value [ 634.453466] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 634.453466] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] updated_port = self._update_port( [ 634.453466] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 634.453466] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] _ensure_no_port_binding_failure(port) [ 634.453466] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 634.453466] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] raise exception.PortBindingFailed(port_id=port['id']) [ 634.453466] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] nova.exception.PortBindingFailed: Binding failed for port 5e08de89-08df-4236-82f9-1588491bdb78, please check neutron logs for more information. 
[ 634.453466] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] [ 634.453890] env[59620]: INFO nova.compute.manager [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] Terminating instance [ 634.453890] env[59620]: DEBUG oslo_concurrency.lockutils [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Acquiring lock "refresh_cache-4fd28c4b-e5df-475b-bb3d-f163c9f5b436" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 634.453890] env[59620]: DEBUG oslo_concurrency.lockutils [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Acquired lock "refresh_cache-4fd28c4b-e5df-475b-bb3d-f163c9f5b436" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 634.453890] env[59620]: DEBUG nova.network.neutron [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 634.740556] env[59620]: DEBUG nova.network.neutron [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 635.179190] env[59620]: DEBUG nova.network.neutron [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 635.197615] env[59620]: DEBUG oslo_concurrency.lockutils [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Releasing lock "refresh_cache-4fd28c4b-e5df-475b-bb3d-f163c9f5b436" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 635.198066] env[59620]: DEBUG nova.compute.manager [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] Start destroying the instance on the hypervisor. 
{{(pid=59620) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 635.200142] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] Destroying instance {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 635.200697] env[59620]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-b898d360-e9ca-432e-8a91-b4842122612e {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 635.212538] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e175779-c88f-45e0-a6b1-e801baab4ce6 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 635.237826] env[59620]: WARNING nova.virt.vmwareapi.vmops [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 4fd28c4b-e5df-475b-bb3d-f163c9f5b436 could not be found. 
[ 635.238064] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] Instance destroyed {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 635.238331] env[59620]: INFO nova.compute.manager [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] Took 0.04 seconds to destroy the instance on the hypervisor. [ 635.238930] env[59620]: DEBUG oslo.service.loopingcall [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59620) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 635.239091] env[59620]: DEBUG nova.compute.manager [-] [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 635.239642] env[59620]: DEBUG nova.network.neutron [-] [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 635.317578] env[59620]: DEBUG nova.network.neutron [-] [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 635.326346] env[59620]: DEBUG nova.network.neutron [-] [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 635.337828] env[59620]: INFO nova.compute.manager [-] [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] Took 0.10 seconds to deallocate network for instance. [ 635.340150] env[59620]: DEBUG nova.compute.claims [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] Aborting claim: {{(pid=59620) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 635.340150] env[59620]: DEBUG oslo_concurrency.lockutils [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 635.340383] env[59620]: DEBUG oslo_concurrency.lockutils [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 635.458019] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a94da56e-5727-4297-8301-8adc63173ee2 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 635.466892] env[59620]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a89cf12-c634-44f2-87b3-d5f0bff0f891 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 635.497529] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-57beb1a0-0e58-4a2f-8c5d-baa5b5b8542b {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 635.505590] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84bf4809-5510-4ef4-b17e-a0af1ea04f10 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 635.519458] env[59620]: DEBUG nova.compute.provider_tree [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 635.528409] env[59620]: DEBUG nova.scheduler.client.report [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 635.544418] env[59620]: DEBUG oslo_concurrency.lockutils [None 
req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.204s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 635.545083] env[59620]: ERROR nova.compute.manager [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 5e08de89-08df-4236-82f9-1588491bdb78, please check neutron logs for more information. [ 635.545083] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] Traceback (most recent call last): [ 635.545083] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 635.545083] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] self.driver.spawn(context, instance, image_meta, [ 635.545083] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 635.545083] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] self._vmops.spawn(context, instance, image_meta, injected_files, [ 635.545083] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 635.545083] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] vm_ref = self.build_virtual_machine(instance, [ 635.545083] env[59620]: ERROR 
nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 635.545083] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] vif_infos = vmwarevif.get_vif_info(self._session, [ 635.545083] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 635.545419] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] for vif in network_info: [ 635.545419] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 635.545419] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] return self._sync_wrapper(fn, *args, **kwargs) [ 635.545419] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 635.545419] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] self.wait() [ 635.545419] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 635.545419] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] self[:] = self._gt.wait() [ 635.545419] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 635.545419] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] return self._exit_event.wait() [ 635.545419] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] File 
"/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 635.545419] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] result = hub.switch() [ 635.545419] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 635.545419] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] return self.greenlet.switch() [ 635.545419] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 635.545768] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] result = function(*args, **kwargs) [ 635.545768] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 635.545768] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] return func(*args, **kwargs) [ 635.545768] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 635.545768] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] raise e [ 635.545768] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 635.545768] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] nwinfo = self.network_api.allocate_for_instance( [ 635.545768] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 635.545768] 
env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] created_port_ids = self._update_ports_for_instance( [ 635.545768] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 635.545768] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] with excutils.save_and_reraise_exception(): [ 635.545768] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 635.545768] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] self.force_reraise() [ 635.546088] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 635.546088] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] raise self.value [ 635.546088] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 635.546088] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] updated_port = self._update_port( [ 635.546088] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 635.546088] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] _ensure_no_port_binding_failure(port) [ 635.546088] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 635.546088] env[59620]: ERROR 
nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] raise exception.PortBindingFailed(port_id=port['id']) [ 635.546088] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] nova.exception.PortBindingFailed: Binding failed for port 5e08de89-08df-4236-82f9-1588491bdb78, please check neutron logs for more information. [ 635.546088] env[59620]: ERROR nova.compute.manager [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] [ 635.546335] env[59620]: DEBUG nova.compute.utils [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] Binding failed for port 5e08de89-08df-4236-82f9-1588491bdb78, please check neutron logs for more information. {{(pid=59620) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 635.547937] env[59620]: DEBUG nova.compute.manager [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] Build of instance 4fd28c4b-e5df-475b-bb3d-f163c9f5b436 was re-scheduled: Binding failed for port 5e08de89-08df-4236-82f9-1588491bdb78, please check neutron logs for more information. 
{{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 635.548436] env[59620]: DEBUG nova.compute.manager [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] Unplugging VIFs for instance {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 635.548849] env[59620]: DEBUG oslo_concurrency.lockutils [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Acquiring lock "refresh_cache-4fd28c4b-e5df-475b-bb3d-f163c9f5b436" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 635.549092] env[59620]: DEBUG oslo_concurrency.lockutils [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Acquired lock "refresh_cache-4fd28c4b-e5df-475b-bb3d-f163c9f5b436" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 635.549510] env[59620]: DEBUG nova.network.neutron [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 635.673187] env[59620]: DEBUG nova.network.neutron [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 636.279641] env[59620]: DEBUG nova.network.neutron [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 636.291903] env[59620]: DEBUG oslo_concurrency.lockutils [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Releasing lock "refresh_cache-4fd28c4b-e5df-475b-bb3d-f163c9f5b436" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 636.292160] env[59620]: DEBUG nova.compute.manager [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 636.292321] env[59620]: DEBUG nova.compute.manager [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 636.292478] env[59620]: DEBUG nova.network.neutron [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 636.415511] env[59620]: DEBUG nova.network.neutron [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 636.433920] env[59620]: DEBUG nova.network.neutron [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 636.449521] env[59620]: INFO nova.compute.manager [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] [instance: 4fd28c4b-e5df-475b-bb3d-f163c9f5b436] Took 0.16 seconds to deallocate network for instance. 
[ 636.590799] env[59620]: INFO nova.scheduler.client.report [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Deleted allocations for instance 4fd28c4b-e5df-475b-bb3d-f163c9f5b436 [ 636.619975] env[59620]: DEBUG oslo_concurrency.lockutils [None req-9b70fc6d-6afb-4564-b079-d74b8b7b8584 tempest-ServerDiagnosticsNegativeTest-1041573668 tempest-ServerDiagnosticsNegativeTest-1041573668-project-member] Lock "4fd28c4b-e5df-475b-bb3d-f163c9f5b436" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 13.754s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 637.023786] env[59620]: ERROR nova.compute.manager [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 9d54dbc2-c248-4bfc-a3d5-70d355561876, please check neutron logs for more information. 
[ 637.023786] env[59620]: ERROR nova.compute.manager Traceback (most recent call last): [ 637.023786] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 637.023786] env[59620]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 637.023786] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 637.023786] env[59620]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 637.023786] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 637.023786] env[59620]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 637.023786] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 637.023786] env[59620]: ERROR nova.compute.manager self.force_reraise() [ 637.023786] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 637.023786] env[59620]: ERROR nova.compute.manager raise self.value [ 637.023786] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 637.023786] env[59620]: ERROR nova.compute.manager updated_port = self._update_port( [ 637.023786] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 637.023786] env[59620]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 637.024320] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 637.024320] env[59620]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 637.024320] env[59620]: ERROR 
nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 9d54dbc2-c248-4bfc-a3d5-70d355561876, please check neutron logs for more information. [ 637.024320] env[59620]: ERROR nova.compute.manager [ 637.024320] env[59620]: Traceback (most recent call last): [ 637.024320] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 637.024320] env[59620]: listener.cb(fileno) [ 637.024320] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 637.024320] env[59620]: result = function(*args, **kwargs) [ 637.024320] env[59620]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 637.024320] env[59620]: return func(*args, **kwargs) [ 637.024320] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 637.024320] env[59620]: raise e [ 637.024320] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 637.024320] env[59620]: nwinfo = self.network_api.allocate_for_instance( [ 637.024320] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 637.024320] env[59620]: created_port_ids = self._update_ports_for_instance( [ 637.024320] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 637.024320] env[59620]: with excutils.save_and_reraise_exception(): [ 637.024320] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 637.024320] env[59620]: self.force_reraise() [ 637.024320] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 637.024320] env[59620]: raise self.value [ 637.024320] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 637.024320] env[59620]: updated_port = self._update_port( [ 637.024320] 
env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 637.024320] env[59620]: _ensure_no_port_binding_failure(port) [ 637.024320] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 637.024320] env[59620]: raise exception.PortBindingFailed(port_id=port['id']) [ 637.025076] env[59620]: nova.exception.PortBindingFailed: Binding failed for port 9d54dbc2-c248-4bfc-a3d5-70d355561876, please check neutron logs for more information. [ 637.025076] env[59620]: Removing descriptor: 15 [ 637.025076] env[59620]: ERROR nova.compute.manager [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 9d54dbc2-c248-4bfc-a3d5-70d355561876, please check neutron logs for more information. 
[ 637.025076] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] Traceback (most recent call last): [ 637.025076] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 637.025076] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] yield resources [ 637.025076] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 637.025076] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] self.driver.spawn(context, instance, image_meta, [ 637.025076] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 637.025076] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 637.025076] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 637.025076] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] vm_ref = self.build_virtual_machine(instance, [ 637.025426] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 637.025426] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] vif_infos = vmwarevif.get_vif_info(self._session, [ 637.025426] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 637.025426] env[59620]: ERROR 
nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] for vif in network_info: [ 637.025426] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 637.025426] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] return self._sync_wrapper(fn, *args, **kwargs) [ 637.025426] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 637.025426] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] self.wait() [ 637.025426] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 637.025426] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] self[:] = self._gt.wait() [ 637.025426] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 637.025426] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] return self._exit_event.wait() [ 637.025426] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 637.025426] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] result = hub.switch() [ 637.025805] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 637.025805] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] return self.greenlet.switch() [ 637.025805] env[59620]: ERROR 
nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 637.025805] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] result = function(*args, **kwargs) [ 637.025805] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 637.025805] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] return func(*args, **kwargs) [ 637.025805] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 637.025805] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] raise e [ 637.025805] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 637.025805] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] nwinfo = self.network_api.allocate_for_instance( [ 637.025805] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 637.025805] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] created_port_ids = self._update_ports_for_instance( [ 637.025805] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 637.026184] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] with excutils.save_and_reraise_exception(): [ 637.026184] env[59620]: ERROR nova.compute.manager [instance: 
9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 637.026184] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] self.force_reraise() [ 637.026184] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 637.026184] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] raise self.value [ 637.026184] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 637.026184] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] updated_port = self._update_port( [ 637.026184] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 637.026184] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] _ensure_no_port_binding_failure(port) [ 637.026184] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 637.026184] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] raise exception.PortBindingFailed(port_id=port['id']) [ 637.026184] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] nova.exception.PortBindingFailed: Binding failed for port 9d54dbc2-c248-4bfc-a3d5-70d355561876, please check neutron logs for more information. 
[ 637.026184] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] [ 637.026544] env[59620]: INFO nova.compute.manager [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] Terminating instance [ 637.030331] env[59620]: DEBUG oslo_concurrency.lockutils [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Acquiring lock "refresh_cache-9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 637.030331] env[59620]: DEBUG oslo_concurrency.lockutils [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Acquired lock "refresh_cache-9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 637.030331] env[59620]: DEBUG nova.network.neutron [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 637.282240] env[59620]: DEBUG nova.network.neutron [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 637.721351] env[59620]: DEBUG nova.network.neutron [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 637.736117] env[59620]: DEBUG oslo_concurrency.lockutils [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Releasing lock "refresh_cache-9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 637.736117] env[59620]: DEBUG nova.compute.manager [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] Start destroying the instance on the hypervisor. 
{{(pid=59620) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 637.736117] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] Destroying instance {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 637.736117] env[59620]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-95d4ba5e-1b2d-412a-b8af-00ae63a3a010 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 637.744846] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38683d71-f89b-4242-9842-614753bac825 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 637.769117] env[59620]: WARNING nova.virt.vmwareapi.vmops [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e could not be found. 
[ 637.769353] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] Instance destroyed {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 637.769553] env[59620]: INFO nova.compute.manager [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] Took 0.04 seconds to destroy the instance on the hypervisor. [ 637.769756] env[59620]: DEBUG oslo.service.loopingcall [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59620) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 637.769996] env[59620]: DEBUG nova.compute.manager [-] [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 637.770110] env[59620]: DEBUG nova.network.neutron [-] [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 637.839197] env[59620]: DEBUG nova.network.neutron [-] [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 637.846026] env[59620]: DEBUG nova.network.neutron [-] [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 637.857134] env[59620]: INFO nova.compute.manager [-] [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] Took 0.09 seconds to deallocate network for instance. [ 637.860777] env[59620]: DEBUG nova.compute.claims [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] Aborting claim: {{(pid=59620) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 637.860962] env[59620]: DEBUG oslo_concurrency.lockutils [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 637.861190] env[59620]: DEBUG oslo_concurrency.lockutils [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 637.968708] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d67e709-bd18-4960-8c10-e70cd526db87 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 637.980845] env[59620]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d881ce4-2c95-4a8d-958f-20f1ce972b20 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 638.020615] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4141374-44dc-4566-b005-807d256caec1 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 638.029144] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f79c29b-cde4-4a44-924a-83ac20c18d34 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 638.044377] env[59620]: DEBUG nova.compute.provider_tree [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 638.057452] env[59620]: DEBUG nova.scheduler.client.report [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 638.073861] env[59620]: DEBUG oslo_concurrency.lockutils [None 
req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.213s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 638.074476] env[59620]: ERROR nova.compute.manager [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 9d54dbc2-c248-4bfc-a3d5-70d355561876, please check neutron logs for more information. [ 638.074476] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] Traceback (most recent call last): [ 638.074476] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 638.074476] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] self.driver.spawn(context, instance, image_meta, [ 638.074476] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 638.074476] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 638.074476] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 638.074476] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] vm_ref = self.build_virtual_machine(instance, [ 638.074476] env[59620]: ERROR 
nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 638.074476] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] vif_infos = vmwarevif.get_vif_info(self._session, [ 638.074476] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 638.074902] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] for vif in network_info: [ 638.074902] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 638.074902] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] return self._sync_wrapper(fn, *args, **kwargs) [ 638.074902] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 638.074902] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] self.wait() [ 638.074902] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 638.074902] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] self[:] = self._gt.wait() [ 638.074902] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 638.074902] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] return self._exit_event.wait() [ 638.074902] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] File 
"/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 638.074902] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] result = hub.switch() [ 638.074902] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 638.074902] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] return self.greenlet.switch() [ 638.074902] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 638.075260] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] result = function(*args, **kwargs) [ 638.075260] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 638.075260] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] return func(*args, **kwargs) [ 638.075260] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 638.075260] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] raise e [ 638.075260] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 638.075260] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] nwinfo = self.network_api.allocate_for_instance( [ 638.075260] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 638.075260] 
env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] created_port_ids = self._update_ports_for_instance( [ 638.075260] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 638.075260] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] with excutils.save_and_reraise_exception(): [ 638.075260] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 638.075260] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] self.force_reraise() [ 638.075592] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 638.075592] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] raise self.value [ 638.075592] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 638.075592] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] updated_port = self._update_port( [ 638.075592] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 638.075592] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] _ensure_no_port_binding_failure(port) [ 638.075592] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 638.075592] env[59620]: ERROR 
nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] raise exception.PortBindingFailed(port_id=port['id']) [ 638.075592] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] nova.exception.PortBindingFailed: Binding failed for port 9d54dbc2-c248-4bfc-a3d5-70d355561876, please check neutron logs for more information. [ 638.075592] env[59620]: ERROR nova.compute.manager [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] [ 638.075843] env[59620]: DEBUG nova.compute.utils [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] Binding failed for port 9d54dbc2-c248-4bfc-a3d5-70d355561876, please check neutron logs for more information. {{(pid=59620) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 638.077520] env[59620]: DEBUG nova.compute.manager [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] Build of instance 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e was re-scheduled: Binding failed for port 9d54dbc2-c248-4bfc-a3d5-70d355561876, please check neutron logs for more information. 
{{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 638.078060] env[59620]: DEBUG nova.compute.manager [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] Unplugging VIFs for instance {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 638.078191] env[59620]: DEBUG oslo_concurrency.lockutils [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Acquiring lock "refresh_cache-9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 638.078334] env[59620]: DEBUG oslo_concurrency.lockutils [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Acquired lock "refresh_cache-9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 638.078632] env[59620]: DEBUG nova.network.neutron [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 638.323138] env[59620]: DEBUG nova.network.neutron [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 638.525034] env[59620]: ERROR nova.compute.manager [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 252581c2-e5f7-471e-bfe2-b357c144e919, please check neutron logs for more information. [ 638.525034] env[59620]: ERROR nova.compute.manager Traceback (most recent call last): [ 638.525034] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 638.525034] env[59620]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 638.525034] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 638.525034] env[59620]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 638.525034] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 638.525034] env[59620]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 638.525034] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 638.525034] env[59620]: ERROR nova.compute.manager self.force_reraise() [ 638.525034] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 638.525034] env[59620]: ERROR nova.compute.manager raise self.value [ 638.525034] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 638.525034] env[59620]: ERROR nova.compute.manager updated_port = self._update_port( [ 638.525034] env[59620]: 
ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 638.525034] env[59620]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 638.526125] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 638.526125] env[59620]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 638.526125] env[59620]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 252581c2-e5f7-471e-bfe2-b357c144e919, please check neutron logs for more information. [ 638.526125] env[59620]: ERROR nova.compute.manager [ 638.526125] env[59620]: Traceback (most recent call last): [ 638.526125] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 638.526125] env[59620]: listener.cb(fileno) [ 638.526125] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 638.526125] env[59620]: result = function(*args, **kwargs) [ 638.526125] env[59620]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 638.526125] env[59620]: return func(*args, **kwargs) [ 638.526125] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 638.526125] env[59620]: raise e [ 638.526125] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 638.526125] env[59620]: nwinfo = self.network_api.allocate_for_instance( [ 638.526125] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 638.526125] env[59620]: created_port_ids = self._update_ports_for_instance( [ 638.526125] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 638.526125] env[59620]: with excutils.save_and_reraise_exception(): [ 638.526125] env[59620]: File 
"/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 638.526125] env[59620]: self.force_reraise() [ 638.526125] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 638.526125] env[59620]: raise self.value [ 638.526125] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 638.526125] env[59620]: updated_port = self._update_port( [ 638.526125] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 638.526125] env[59620]: _ensure_no_port_binding_failure(port) [ 638.526125] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 638.526125] env[59620]: raise exception.PortBindingFailed(port_id=port['id']) [ 638.526925] env[59620]: nova.exception.PortBindingFailed: Binding failed for port 252581c2-e5f7-471e-bfe2-b357c144e919, please check neutron logs for more information. [ 638.526925] env[59620]: Removing descriptor: 16 [ 638.526925] env[59620]: ERROR nova.compute.manager [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] [instance: b1a72905-ae94-42e3-8926-0f81cb502942] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 252581c2-e5f7-471e-bfe2-b357c144e919, please check neutron logs for more information. 
[ 638.526925] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] Traceback (most recent call last): [ 638.526925] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 638.526925] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] yield resources [ 638.526925] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 638.526925] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] self.driver.spawn(context, instance, image_meta, [ 638.526925] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 638.526925] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] self._vmops.spawn(context, instance, image_meta, injected_files, [ 638.526925] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 638.526925] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] vm_ref = self.build_virtual_machine(instance, [ 638.527277] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 638.527277] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] vif_infos = vmwarevif.get_vif_info(self._session, [ 638.527277] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 638.527277] env[59620]: ERROR 
nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] for vif in network_info: [ 638.527277] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 638.527277] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] return self._sync_wrapper(fn, *args, **kwargs) [ 638.527277] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 638.527277] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] self.wait() [ 638.527277] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 638.527277] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] self[:] = self._gt.wait() [ 638.527277] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 638.527277] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] return self._exit_event.wait() [ 638.527277] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 638.527277] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] result = hub.switch() [ 638.527630] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 638.527630] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] return self.greenlet.switch() [ 638.527630] env[59620]: ERROR 
nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 638.527630] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] result = function(*args, **kwargs) [ 638.527630] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 638.527630] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] return func(*args, **kwargs) [ 638.527630] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 638.527630] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] raise e [ 638.527630] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 638.527630] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] nwinfo = self.network_api.allocate_for_instance( [ 638.527630] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 638.527630] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] created_port_ids = self._update_ports_for_instance( [ 638.527630] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 638.527975] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] with excutils.save_and_reraise_exception(): [ 638.527975] env[59620]: ERROR nova.compute.manager [instance: 
b1a72905-ae94-42e3-8926-0f81cb502942] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 638.527975] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] self.force_reraise() [ 638.527975] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 638.527975] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] raise self.value [ 638.527975] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 638.527975] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] updated_port = self._update_port( [ 638.527975] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 638.527975] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] _ensure_no_port_binding_failure(port) [ 638.527975] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 638.527975] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] raise exception.PortBindingFailed(port_id=port['id']) [ 638.527975] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] nova.exception.PortBindingFailed: Binding failed for port 252581c2-e5f7-471e-bfe2-b357c144e919, please check neutron logs for more information. 
[ 638.527975] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] [ 638.528325] env[59620]: INFO nova.compute.manager [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] [instance: b1a72905-ae94-42e3-8926-0f81cb502942] Terminating instance [ 638.532642] env[59620]: DEBUG oslo_concurrency.lockutils [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Acquiring lock "refresh_cache-b1a72905-ae94-42e3-8926-0f81cb502942" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 638.532642] env[59620]: DEBUG oslo_concurrency.lockutils [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Acquired lock "refresh_cache-b1a72905-ae94-42e3-8926-0f81cb502942" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 638.532642] env[59620]: DEBUG nova.network.neutron [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] [instance: b1a72905-ae94-42e3-8926-0f81cb502942] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 638.626680] env[59620]: DEBUG nova.network.neutron [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] [instance: b1a72905-ae94-42e3-8926-0f81cb502942] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 638.862312] env[59620]: DEBUG nova.network.neutron [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 638.872777] env[59620]: DEBUG oslo_concurrency.lockutils [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Releasing lock "refresh_cache-9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 638.873078] env[59620]: DEBUG nova.compute.manager [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 638.873209] env[59620]: DEBUG nova.compute.manager [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 638.873368] env[59620]: DEBUG nova.network.neutron [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 638.935689] env[59620]: DEBUG nova.network.neutron [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 638.945606] env[59620]: DEBUG nova.network.neutron [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 638.952496] env[59620]: INFO nova.compute.manager [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] [instance: 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e] Took 0.08 seconds to deallocate network for instance. 
[ 639.030206] env[59620]: DEBUG nova.network.neutron [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] [instance: b1a72905-ae94-42e3-8926-0f81cb502942] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 639.048361] env[59620]: DEBUG oslo_concurrency.lockutils [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Releasing lock "refresh_cache-b1a72905-ae94-42e3-8926-0f81cb502942" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 639.048758] env[59620]: DEBUG nova.compute.manager [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] [instance: b1a72905-ae94-42e3-8926-0f81cb502942] Start destroying the instance on the hypervisor. 
{{(pid=59620) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 639.048979] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] [instance: b1a72905-ae94-42e3-8926-0f81cb502942] Destroying instance {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 639.049887] env[59620]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-67ba23a8-9b75-45dc-8fdc-b6ba292c40dc {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 639.061495] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7cb6d432-cdd3-41eb-9d7b-1e7b4449a1f9 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 639.073916] env[59620]: INFO nova.scheduler.client.report [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Deleted allocations for instance 9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e [ 639.090748] env[59620]: WARNING nova.virt.vmwareapi.vmops [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] [instance: b1a72905-ae94-42e3-8926-0f81cb502942] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance b1a72905-ae94-42e3-8926-0f81cb502942 could not be found. 
[ 639.091228] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] [instance: b1a72905-ae94-42e3-8926-0f81cb502942] Instance destroyed {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 639.091476] env[59620]: INFO nova.compute.manager [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] [instance: b1a72905-ae94-42e3-8926-0f81cb502942] Took 0.04 seconds to destroy the instance on the hypervisor. [ 639.091766] env[59620]: DEBUG oslo.service.loopingcall [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=59620) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 639.092562] env[59620]: DEBUG nova.compute.manager [-] [instance: b1a72905-ae94-42e3-8926-0f81cb502942] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 639.092644] env[59620]: DEBUG nova.network.neutron [-] [instance: b1a72905-ae94-42e3-8926-0f81cb502942] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 639.094675] env[59620]: DEBUG oslo_concurrency.lockutils [None req-809cd186-fb47-4352-a103-a7f5850453a7 tempest-AttachInterfacesUnderV243Test-1609256439 tempest-AttachInterfacesUnderV243Test-1609256439-project-member] Lock "9c4dffc8-3065-46ea-bdbf-49aa5bbaff6e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 13.775s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 639.142795] env[59620]: DEBUG nova.network.neutron [-] [instance: b1a72905-ae94-42e3-8926-0f81cb502942] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 639.155037] env[59620]: DEBUG nova.network.neutron [-] [instance: b1a72905-ae94-42e3-8926-0f81cb502942] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 639.166744] env[59620]: INFO nova.compute.manager [-] [instance: b1a72905-ae94-42e3-8926-0f81cb502942] Took 0.07 seconds to deallocate network for instance. 
[ 639.168559] env[59620]: DEBUG nova.compute.claims [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] [instance: b1a72905-ae94-42e3-8926-0f81cb502942] Aborting claim: {{(pid=59620) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 639.170035] env[59620]: DEBUG oslo_concurrency.lockutils [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 639.170035] env[59620]: DEBUG oslo_concurrency.lockutils [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 639.306213] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b33438f3-0d78-4d08-aa6b-3548c608eca8 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 639.315634] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-26d60a8f-8eab-46ec-a2cd-961f03a805e9 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 639.352896] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cedfb619-7668-4ede-87e5-9281261e2361 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 639.362025] 
env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e59d1bbc-4418-49f2-8d26-3ea5355ae886 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 639.375034] env[59620]: DEBUG nova.compute.provider_tree [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 639.387565] env[59620]: DEBUG nova.scheduler.client.report [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 639.409652] env[59620]: DEBUG oslo_concurrency.lockutils [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.240s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 639.410316] env[59620]: ERROR nova.compute.manager [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 
tempest-ServersV294TestFqdnHostnames-1734968333-project-member] [instance: b1a72905-ae94-42e3-8926-0f81cb502942] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 252581c2-e5f7-471e-bfe2-b357c144e919, please check neutron logs for more information. [ 639.410316] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] Traceback (most recent call last): [ 639.410316] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 639.410316] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] self.driver.spawn(context, instance, image_meta, [ 639.410316] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 639.410316] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] self._vmops.spawn(context, instance, image_meta, injected_files, [ 639.410316] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 639.410316] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] vm_ref = self.build_virtual_machine(instance, [ 639.410316] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 639.410316] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] vif_infos = vmwarevif.get_vif_info(self._session, [ 639.410316] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 639.410696] env[59620]: ERROR 
nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] for vif in network_info: [ 639.410696] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 639.410696] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] return self._sync_wrapper(fn, *args, **kwargs) [ 639.410696] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 639.410696] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] self.wait() [ 639.410696] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 639.410696] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] self[:] = self._gt.wait() [ 639.410696] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 639.410696] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] return self._exit_event.wait() [ 639.410696] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 639.410696] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] result = hub.switch() [ 639.410696] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 639.410696] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] return self.greenlet.switch() [ 639.410696] env[59620]: ERROR 
nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 639.411150] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] result = function(*args, **kwargs) [ 639.411150] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 639.411150] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] return func(*args, **kwargs) [ 639.411150] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 639.411150] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] raise e [ 639.411150] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 639.411150] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] nwinfo = self.network_api.allocate_for_instance( [ 639.411150] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 639.411150] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] created_port_ids = self._update_ports_for_instance( [ 639.411150] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 639.411150] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] with excutils.save_and_reraise_exception(): [ 639.411150] env[59620]: ERROR nova.compute.manager [instance: 
b1a72905-ae94-42e3-8926-0f81cb502942] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 639.411150] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] self.force_reraise() [ 639.411624] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 639.411624] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] raise self.value [ 639.411624] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 639.411624] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] updated_port = self._update_port( [ 639.411624] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 639.411624] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] _ensure_no_port_binding_failure(port) [ 639.411624] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 639.411624] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] raise exception.PortBindingFailed(port_id=port['id']) [ 639.411624] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] nova.exception.PortBindingFailed: Binding failed for port 252581c2-e5f7-471e-bfe2-b357c144e919, please check neutron logs for more information. 
[ 639.411624] env[59620]: ERROR nova.compute.manager [instance: b1a72905-ae94-42e3-8926-0f81cb502942] [ 639.411924] env[59620]: DEBUG nova.compute.utils [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] [instance: b1a72905-ae94-42e3-8926-0f81cb502942] Binding failed for port 252581c2-e5f7-471e-bfe2-b357c144e919, please check neutron logs for more information. {{(pid=59620) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 639.415891] env[59620]: DEBUG nova.compute.manager [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] [instance: b1a72905-ae94-42e3-8926-0f81cb502942] Build of instance b1a72905-ae94-42e3-8926-0f81cb502942 was re-scheduled: Binding failed for port 252581c2-e5f7-471e-bfe2-b357c144e919, please check neutron logs for more information. {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 639.415891] env[59620]: DEBUG nova.compute.manager [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] [instance: b1a72905-ae94-42e3-8926-0f81cb502942] Unplugging VIFs for instance {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 639.415891] env[59620]: DEBUG oslo_concurrency.lockutils [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Acquiring lock "refresh_cache-b1a72905-ae94-42e3-8926-0f81cb502942" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 639.416117] env[59620]: DEBUG oslo_concurrency.lockutils [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de 
tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Acquired lock "refresh_cache-b1a72905-ae94-42e3-8926-0f81cb502942" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 639.416117] env[59620]: DEBUG nova.network.neutron [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] [instance: b1a72905-ae94-42e3-8926-0f81cb502942] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 639.651268] env[59620]: DEBUG nova.network.neutron [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] [instance: b1a72905-ae94-42e3-8926-0f81cb502942] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 639.930131] env[59620]: DEBUG nova.network.neutron [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] [instance: b1a72905-ae94-42e3-8926-0f81cb502942] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 639.955665] env[59620]: DEBUG oslo_concurrency.lockutils [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Releasing lock "refresh_cache-b1a72905-ae94-42e3-8926-0f81cb502942" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 639.955930] env[59620]: DEBUG nova.compute.manager [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 
tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 639.956077] env[59620]: DEBUG nova.compute.manager [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] [instance: b1a72905-ae94-42e3-8926-0f81cb502942] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 639.956258] env[59620]: DEBUG nova.network.neutron [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] [instance: b1a72905-ae94-42e3-8926-0f81cb502942] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 639.981978] env[59620]: DEBUG nova.network.neutron [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] [instance: b1a72905-ae94-42e3-8926-0f81cb502942] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 639.993094] env[59620]: DEBUG nova.network.neutron [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] [instance: b1a72905-ae94-42e3-8926-0f81cb502942] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 640.002850] env[59620]: INFO nova.compute.manager [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] [instance: b1a72905-ae94-42e3-8926-0f81cb502942] Took 0.05 seconds to deallocate network for instance. [ 640.103901] env[59620]: INFO nova.scheduler.client.report [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Deleted allocations for instance b1a72905-ae94-42e3-8926-0f81cb502942 [ 640.123407] env[59620]: DEBUG oslo_concurrency.lockutils [None req-ee93ddfa-1356-4c34-a435-08d39e0f65de tempest-ServersV294TestFqdnHostnames-1734968333 tempest-ServersV294TestFqdnHostnames-1734968333-project-member] Lock "b1a72905-ae94-42e3-8926-0f81cb502942" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.652s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 657.214685] env[59620]: WARNING oslo_vmware.rw_handles [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 657.214685] env[59620]: ERROR oslo_vmware.rw_handles Traceback (most recent 
call last): [ 657.214685] env[59620]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 657.214685] env[59620]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 657.214685] env[59620]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 657.214685] env[59620]: ERROR oslo_vmware.rw_handles response.begin() [ 657.214685] env[59620]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 657.214685] env[59620]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 657.214685] env[59620]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 657.214685] env[59620]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 657.214685] env[59620]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 657.214685] env[59620]: ERROR oslo_vmware.rw_handles [ 657.217654] env[59620]: DEBUG nova.virt.vmwareapi.images [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] Downloaded image file data 2efa4364-ba59-4de9-978f-169a769ee710 to vmware_temp/babc7654-fad0-49d7-893b-2b5cb64b39b0/2efa4364-ba59-4de9-978f-169a769ee710/tmp-sparse.vmdk on the data store datastore1 {{(pid=59620) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 657.221017] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] Caching image {{(pid=59620) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 657.221017] 
env[59620]: DEBUG nova.virt.vmwareapi.vm_util [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Copying Virtual Disk [datastore1] vmware_temp/babc7654-fad0-49d7-893b-2b5cb64b39b0/2efa4364-ba59-4de9-978f-169a769ee710/tmp-sparse.vmdk to [datastore1] vmware_temp/babc7654-fad0-49d7-893b-2b5cb64b39b0/2efa4364-ba59-4de9-978f-169a769ee710/2efa4364-ba59-4de9-978f-169a769ee710.vmdk {{(pid=59620) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 657.221017] env[59620]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-e4a3fc13-22c8-4829-ab5f-a19c164a2f05 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 657.228894] env[59620]: DEBUG oslo_vmware.api [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Waiting for the task: (returnval){ [ 657.228894] env[59620]: value = "task-1308613" [ 657.228894] env[59620]: _type = "Task" [ 657.228894] env[59620]: } to complete. {{(pid=59620) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 657.241452] env[59620]: DEBUG oslo_vmware.api [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Task: {'id': task-1308613, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59620) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 657.739690] env[59620]: DEBUG oslo_vmware.exceptions [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Fault InvalidArgument not matched. 
{{(pid=59620) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 657.739948] env[59620]: DEBUG oslo_concurrency.lockutils [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Releasing lock "[datastore1] devstack-image-cache_base/2efa4364-ba59-4de9-978f-169a769ee710/2efa4364-ba59-4de9-978f-169a769ee710.vmdk" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 657.741035] env[59620]: ERROR nova.compute.manager [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 657.741035] env[59620]: Faults: ['InvalidArgument'] [ 657.741035] env[59620]: ERROR nova.compute.manager [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] Traceback (most recent call last): [ 657.741035] env[59620]: ERROR nova.compute.manager [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 657.741035] env[59620]: ERROR nova.compute.manager [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] yield resources [ 657.741035] env[59620]: ERROR nova.compute.manager [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 657.741035] env[59620]: ERROR nova.compute.manager [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] self.driver.spawn(context, instance, image_meta, [ 657.741035] env[59620]: ERROR nova.compute.manager [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 657.741035] env[59620]: ERROR nova.compute.manager [instance: 
2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] self._vmops.spawn(context, instance, image_meta, injected_files, [ 657.741035] env[59620]: ERROR nova.compute.manager [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 657.741035] env[59620]: ERROR nova.compute.manager [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] self._fetch_image_if_missing(context, vi) [ 657.741035] env[59620]: ERROR nova.compute.manager [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 657.741421] env[59620]: ERROR nova.compute.manager [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] image_cache(vi, tmp_image_ds_loc) [ 657.741421] env[59620]: ERROR nova.compute.manager [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 657.741421] env[59620]: ERROR nova.compute.manager [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] vm_util.copy_virtual_disk( [ 657.741421] env[59620]: ERROR nova.compute.manager [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 657.741421] env[59620]: ERROR nova.compute.manager [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] session._wait_for_task(vmdk_copy_task) [ 657.741421] env[59620]: ERROR nova.compute.manager [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 657.741421] env[59620]: ERROR nova.compute.manager [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] return self.wait_for_task(task_ref) [ 657.741421] env[59620]: ERROR nova.compute.manager [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 657.741421] env[59620]: ERROR nova.compute.manager [instance: 
2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] return evt.wait() [ 657.741421] env[59620]: ERROR nova.compute.manager [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 657.741421] env[59620]: ERROR nova.compute.manager [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] result = hub.switch() [ 657.741421] env[59620]: ERROR nova.compute.manager [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 657.741421] env[59620]: ERROR nova.compute.manager [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] return self.greenlet.switch() [ 657.742103] env[59620]: ERROR nova.compute.manager [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 657.742103] env[59620]: ERROR nova.compute.manager [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] self.f(*self.args, **self.kw) [ 657.742103] env[59620]: ERROR nova.compute.manager [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 657.742103] env[59620]: ERROR nova.compute.manager [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] raise exceptions.translate_fault(task_info.error) [ 657.742103] env[59620]: ERROR nova.compute.manager [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 657.742103] env[59620]: ERROR nova.compute.manager [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] Faults: ['InvalidArgument'] [ 657.742103] env[59620]: ERROR nova.compute.manager [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] [ 657.742103] env[59620]: INFO nova.compute.manager [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] 
[instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] Terminating instance [ 657.743761] env[59620]: DEBUG oslo_concurrency.lockutils [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Acquiring lock "refresh_cache-2f8a30ee-d22b-42e6-abe6-db22d9a6fe73" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 657.743972] env[59620]: DEBUG oslo_concurrency.lockutils [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Acquired lock "refresh_cache-2f8a30ee-d22b-42e6-abe6-db22d9a6fe73" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 657.744077] env[59620]: DEBUG nova.network.neutron [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 657.928601] env[59620]: DEBUG nova.network.neutron [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 658.596741] env[59620]: DEBUG nova.network.neutron [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 658.612417] env[59620]: DEBUG oslo_concurrency.lockutils [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Releasing lock "refresh_cache-2f8a30ee-d22b-42e6-abe6-db22d9a6fe73" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 658.612837] env[59620]: DEBUG nova.compute.manager [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] Start destroying the instance on the hypervisor. 
{{(pid=59620) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 658.613051] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] Destroying instance {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 658.614267] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-02a3064d-3265-41ac-b944-e00074a2b080 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 658.622534] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] Unregistering the VM {{(pid=59620) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 658.623118] env[59620]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-0b8d6124-babc-4f1b-b6ac-4e3d12a8bbfd {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 658.658040] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] Unregistered the VM {{(pid=59620) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 658.658040] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] Deleting contents of the VM from datastore datastore1 
{{(pid=59620) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 658.658040] env[59620]: DEBUG nova.virt.vmwareapi.ds_util [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Deleting the datastore file [datastore1] 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73 {{(pid=59620) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 658.658040] env[59620]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-7e6fa4a4-e01f-479c-addc-c9e3d0e94edf {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 658.667722] env[59620]: DEBUG oslo_vmware.api [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Waiting for the task: (returnval){ [ 658.667722] env[59620]: value = "task-1308615" [ 658.667722] env[59620]: _type = "Task" [ 658.667722] env[59620]: } to complete. {{(pid=59620) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 658.679552] env[59620]: DEBUG oslo_vmware.api [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Task: {'id': task-1308615, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59620) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 659.178753] env[59620]: DEBUG oslo_vmware.api [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Task: {'id': task-1308615, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.034199} completed successfully. 
{{(pid=59620) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 659.180092] env[59620]: DEBUG nova.virt.vmwareapi.ds_util [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Deleted the datastore file {{(pid=59620) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 659.180092] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] Deleted contents of the VM from datastore datastore1 {{(pid=59620) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 659.180092] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] Instance destroyed {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 659.180092] env[59620]: INFO nova.compute.manager [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] Took 0.57 seconds to destroy the instance on the hypervisor. [ 659.180092] env[59620]: DEBUG oslo.service.loopingcall [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=59620) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 659.180521] env[59620]: DEBUG nova.compute.manager [-] [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] Skipping network deallocation for instance since networking was not requested. {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 659.182366] env[59620]: DEBUG nova.compute.claims [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] Aborting claim: {{(pid=59620) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 659.182576] env[59620]: DEBUG oslo_concurrency.lockutils [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 659.182741] env[59620]: DEBUG oslo_concurrency.lockutils [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 659.251933] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a789174-3293-4d37-9ae6-d13132f3cc8f {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 659.261290] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a5eac7b-3043-4a72-bbc8-bd6eb74b766a {{(pid=59620) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 659.294868] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4c7f032-d7ff-44c3-8508-a28a6eb09a18 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 659.304027] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b66db539-4e7d-404c-96ec-ffbf66fcf897 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 659.318990] env[59620]: DEBUG nova.compute.provider_tree [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 659.333175] env[59620]: DEBUG nova.scheduler.client.report [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 659.352797] env[59620]: DEBUG oslo_concurrency.lockutils [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.170s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 659.353353] env[59620]: ERROR nova.compute.manager [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 659.353353] env[59620]: Faults: ['InvalidArgument'] [ 659.353353] env[59620]: ERROR nova.compute.manager [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] Traceback (most recent call last): [ 659.353353] env[59620]: ERROR nova.compute.manager [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 659.353353] env[59620]: ERROR nova.compute.manager [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] self.driver.spawn(context, instance, image_meta, [ 659.353353] env[59620]: ERROR nova.compute.manager [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 659.353353] env[59620]: ERROR nova.compute.manager [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] self._vmops.spawn(context, instance, image_meta, injected_files, [ 659.353353] env[59620]: ERROR nova.compute.manager [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 659.353353] env[59620]: ERROR nova.compute.manager [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] self._fetch_image_if_missing(context, vi) [ 659.353353] env[59620]: ERROR nova.compute.manager [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 659.353353] env[59620]: ERROR nova.compute.manager 
[instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] image_cache(vi, tmp_image_ds_loc) [ 659.353353] env[59620]: ERROR nova.compute.manager [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 659.353788] env[59620]: ERROR nova.compute.manager [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] vm_util.copy_virtual_disk( [ 659.353788] env[59620]: ERROR nova.compute.manager [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 659.353788] env[59620]: ERROR nova.compute.manager [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] session._wait_for_task(vmdk_copy_task) [ 659.353788] env[59620]: ERROR nova.compute.manager [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 659.353788] env[59620]: ERROR nova.compute.manager [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] return self.wait_for_task(task_ref) [ 659.353788] env[59620]: ERROR nova.compute.manager [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 659.353788] env[59620]: ERROR nova.compute.manager [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] return evt.wait() [ 659.353788] env[59620]: ERROR nova.compute.manager [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 659.353788] env[59620]: ERROR nova.compute.manager [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] result = hub.switch() [ 659.353788] env[59620]: ERROR nova.compute.manager [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 659.353788] env[59620]: ERROR nova.compute.manager [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] return 
self.greenlet.switch() [ 659.353788] env[59620]: ERROR nova.compute.manager [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 659.353788] env[59620]: ERROR nova.compute.manager [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] self.f(*self.args, **self.kw) [ 659.354229] env[59620]: ERROR nova.compute.manager [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 659.354229] env[59620]: ERROR nova.compute.manager [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] raise exceptions.translate_fault(task_info.error) [ 659.354229] env[59620]: ERROR nova.compute.manager [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 659.354229] env[59620]: ERROR nova.compute.manager [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] Faults: ['InvalidArgument'] [ 659.354229] env[59620]: ERROR nova.compute.manager [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] [ 659.354395] env[59620]: DEBUG nova.compute.utils [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] VimFaultException {{(pid=59620) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 659.358757] env[59620]: DEBUG nova.compute.manager [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] Build of instance 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73 was re-scheduled: A specified parameter was not correct: fileType [ 659.358757] env[59620]: Faults: ['InvalidArgument'] {{(pid=59620) _do_build_and_run_instance 
/opt/stack/nova/nova/compute/manager.py:2450}} [ 659.359191] env[59620]: DEBUG nova.compute.manager [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] Unplugging VIFs for instance {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 659.359421] env[59620]: DEBUG oslo_concurrency.lockutils [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Acquiring lock "refresh_cache-2f8a30ee-d22b-42e6-abe6-db22d9a6fe73" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 659.359564] env[59620]: DEBUG oslo_concurrency.lockutils [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Acquired lock "refresh_cache-2f8a30ee-d22b-42e6-abe6-db22d9a6fe73" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 659.359717] env[59620]: DEBUG nova.network.neutron [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 659.456466] env[59620]: DEBUG nova.network.neutron [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 659.936691] env[59620]: DEBUG nova.network.neutron [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 659.949832] env[59620]: DEBUG oslo_concurrency.lockutils [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Releasing lock "refresh_cache-2f8a30ee-d22b-42e6-abe6-db22d9a6fe73" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 659.950082] env[59620]: DEBUG nova.compute.manager [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 659.950443] env[59620]: DEBUG nova.compute.manager [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] Skipping network deallocation for instance since networking was not requested. 
{{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 660.062695] env[59620]: INFO nova.scheduler.client.report [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Deleted allocations for instance 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73 [ 660.096612] env[59620]: DEBUG oslo_concurrency.lockutils [None req-988b6bef-bb21-4c07-9beb-f939cd0f3573 tempest-ServerDiagnosticsV248Test-2041210563 tempest-ServerDiagnosticsV248Test-2041210563-project-member] Lock "2f8a30ee-d22b-42e6-abe6-db22d9a6fe73" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 54.118s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 660.096834] env[59620]: DEBUG oslo_concurrency.lockutils [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Lock "2f8a30ee-d22b-42e6-abe6-db22d9a6fe73" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 53.869s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 660.097171] env[59620]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-73bf3f91-1c1e-407d-93fc-fbe11fbbae23 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 660.112335] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-760df015-8176-4b3e-aeac-9411598e2256 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 660.170848] env[59620]: INFO nova.compute.manager [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] [instance: 2f8a30ee-d22b-42e6-abe6-db22d9a6fe73] During the sync_power process the instance has moved from host None to host cpu-1 [ 660.171015] 
env[59620]: DEBUG oslo_concurrency.lockutils [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Lock "2f8a30ee-d22b-42e6-abe6-db22d9a6fe73" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.074s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 678.367794] env[59620]: DEBUG oslo_service.periodic_task [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59620) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 678.367794] env[59620]: DEBUG oslo_service.periodic_task [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=59620) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 678.385725] env[59620]: DEBUG oslo_service.periodic_task [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59620) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 678.385725] env[59620]: DEBUG nova.compute.manager [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Starting heal instance info cache {{(pid=59620) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 678.385725] env[59620]: DEBUG nova.compute.manager [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Rebuilding the list of instances to heal {{(pid=59620) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 678.403148] env[59620]: DEBUG nova.compute.manager [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Didn't find any instances for network info cache update. 
{{(pid=59620) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 678.403373] env[59620]: DEBUG oslo_service.periodic_task [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59620) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 678.403527] env[59620]: DEBUG oslo_service.periodic_task [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59620) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 678.403668] env[59620]: DEBUG oslo_service.periodic_task [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59620) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 678.404197] env[59620]: DEBUG nova.compute.manager [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=59620) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 679.960885] env[59620]: DEBUG oslo_service.periodic_task [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59620) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 679.960885] env[59620]: DEBUG oslo_service.periodic_task [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59620) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 679.960885] env[59620]: DEBUG oslo_service.periodic_task [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59620) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 679.960885] env[59620]: DEBUG oslo_service.periodic_task [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Running periodic task ComputeManager.update_available_resource {{(pid=59620) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 679.981481] env[59620]: DEBUG oslo_concurrency.lockutils [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 679.981481] env[59620]: DEBUG oslo_concurrency.lockutils [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 679.981553] env[59620]: DEBUG oslo_concurrency.lockutils [None 
req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 679.981678] env[59620]: DEBUG nova.compute.resource_tracker [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59620) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 679.983225] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ebe47719-8bea-40e3-87b2-8c29508deeca {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 679.994697] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a097f517-c61f-4f52-9976-765dbcbec916 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 680.010092] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84870aeb-bb29-4b4c-91a2-45f60da0d119 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 680.016945] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2e0657ec-7aba-4ea1-8cfa-d25e09b0abe7 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 680.048360] env[59620]: DEBUG nova.compute.resource_tracker [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181496MB free_disk=136GB free_vcpus=48 pci_devices=None {{(pid=59620) _report_hypervisor_resource_view 
/opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 680.048572] env[59620]: DEBUG oslo_concurrency.lockutils [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 680.048803] env[59620]: DEBUG oslo_concurrency.lockutils [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 680.102474] env[59620]: DEBUG nova.compute.resource_tracker [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=59620) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 680.102647] env[59620]: DEBUG nova.compute.resource_tracker [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=59620) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 680.119418] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a80443eb-30f9-4639-84e6-c3b297c216df {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 680.127121] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c16caa6c-314b-4929-a464-2bea105f7663 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 680.159481] env[59620]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b0851066-2a7f-4a41-b6f3-f3bcfb976522 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 680.167609] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3fd6cb77-3736-4ec0-8f72-129aa02d4ce2 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 680.181936] env[59620]: DEBUG nova.compute.provider_tree [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 680.191863] env[59620]: DEBUG nova.scheduler.client.report [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 680.205115] env[59620]: DEBUG nova.compute.resource_tracker [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59620) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 680.205115] env[59620]: DEBUG oslo_concurrency.lockutils [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.156s {{(pid=59620) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 688.197346] env[59620]: DEBUG oslo_concurrency.lockutils [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Acquiring lock "763e888f-2290-4ca2-ab8e-703450a21e35" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 688.197710] env[59620]: DEBUG oslo_concurrency.lockutils [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Lock "763e888f-2290-4ca2-ab8e-703450a21e35" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 688.208111] env[59620]: DEBUG nova.compute.manager [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] Starting instance... 
{{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 688.265035] env[59620]: DEBUG oslo_concurrency.lockutils [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 688.265293] env[59620]: DEBUG oslo_concurrency.lockutils [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 688.266752] env[59620]: INFO nova.compute.claims [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 688.346454] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0dc6d71-8056-4caa-b8e5-af95ae547828 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 688.354911] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b42a7627-c8f9-4435-90e0-865a1956d081 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 688.393546] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fdefe570-1aee-49b6-a2dd-a53ce9da7506 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 688.402517] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c31b07d-b3c0-437f-9a86-9a50cf2d7e48 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 688.417805] env[59620]: DEBUG nova.compute.provider_tree [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 688.429171] env[59620]: DEBUG nova.scheduler.client.report [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 688.444652] env[59620]: DEBUG oslo_concurrency.lockutils [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.179s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 688.445160] env[59620]: DEBUG nova.compute.manager [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] Start building networks asynchronously for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 688.485011] env[59620]: DEBUG nova.compute.utils [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Using /dev/sd instead of None {{(pid=59620) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 688.486537] env[59620]: DEBUG nova.compute.manager [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] Allocating IP information in the background. {{(pid=59620) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 688.486710] env[59620]: DEBUG nova.network.neutron [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] allocate_for_instance() {{(pid=59620) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 688.500061] env[59620]: DEBUG nova.compute.manager [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] Start building block device mappings for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 688.575402] env[59620]: DEBUG nova.compute.manager [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] Start spawning the instance on the hypervisor. {{(pid=59620) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 688.601722] env[59620]: DEBUG nova.virt.hardware [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T20:10:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T20:10:40Z,direct_url=,disk_format='vmdk',id=2efa4364-ba59-4de9-978f-169a769ee710,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f77c77bd878b4e17895863a4a7a8191f',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T20:10:41Z,virtual_size=,visibility=), allow threads: False {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 688.602087] env[59620]: DEBUG nova.virt.hardware [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Flavor limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 688.602248] env[59620]: DEBUG nova.virt.hardware [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Image limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 688.602426] env[59620]: DEBUG nova.virt.hardware [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Flavor pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 688.602645] env[59620]: DEBUG nova.virt.hardware [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Image pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 688.602803] env[59620]: DEBUG nova.virt.hardware [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 688.603018] env[59620]: DEBUG nova.virt.hardware [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 688.603175] env[59620]: DEBUG nova.virt.hardware [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 688.603335] env[59620]: DEBUG nova.virt.hardware [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Got 1 possible topologies {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 688.603489] env[59620]: DEBUG nova.virt.hardware [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 688.603654] env[59620]: DEBUG nova.virt.hardware [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 688.604511] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0c62648-09d3-48b9-9f19-b370d96693a1 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 688.613932] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc653459-b92d-4311-aa8d-02cfbc2045e0 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 688.712315] env[59620]: DEBUG nova.policy [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c8b1d847b0d24854b7d6dae9b1e0d809', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6313f0295f5f49a8a507833c28e32831', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59620) authorize /opt/stack/nova/nova/policy.py:203}}
[ 690.246125] env[59620]: DEBUG nova.network.neutron [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] Successfully created port: ad221b8a-6737-4342-b2e8-d01b33365352 {{(pid=59620) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 690.370138] env[59620]: DEBUG oslo_concurrency.lockutils [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Acquiring lock "b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 690.370138] env[59620]: DEBUG oslo_concurrency.lockutils [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 690.382387] env[59620]: DEBUG nova.compute.manager [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] Starting instance... {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 690.450361] env[59620]: DEBUG oslo_concurrency.lockutils [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 690.450361] env[59620]: DEBUG oslo_concurrency.lockutils [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 690.450361] env[59620]: INFO nova.compute.claims [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 690.593228] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-760c2d13-8bdf-451e-a4c4-78b3169b05c1 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 690.606246] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e9b5810-3534-421a-ac46-2605e3457341 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 690.643706] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ef843d8-6750-4fdb-93b5-f651b6ee8c83 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 690.652896] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c1c796ee-d0a3-4fc9-b21f-e6c02dc88994 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 690.668634] env[59620]: DEBUG nova.compute.provider_tree [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 690.679294] env[59620]: DEBUG nova.scheduler.client.report [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 690.697984] env[59620]: DEBUG oslo_concurrency.lockutils [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.250s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 690.699706] env[59620]: DEBUG nova.compute.manager [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] Start building networks asynchronously for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 690.743244] env[59620]: DEBUG nova.compute.utils [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Using /dev/sd instead of None {{(pid=59620) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 690.745257] env[59620]: DEBUG nova.compute.manager [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] Allocating IP information in the background. {{(pid=59620) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 690.745291] env[59620]: DEBUG nova.network.neutron [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] allocate_for_instance() {{(pid=59620) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 690.765541] env[59620]: DEBUG nova.compute.manager [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] Start building block device mappings for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 690.850807] env[59620]: DEBUG nova.compute.manager [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] Start spawning the instance on the hypervisor. {{(pid=59620) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 690.879801] env[59620]: DEBUG nova.virt.hardware [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T20:10:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T20:10:40Z,direct_url=,disk_format='vmdk',id=2efa4364-ba59-4de9-978f-169a769ee710,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f77c77bd878b4e17895863a4a7a8191f',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T20:10:41Z,virtual_size=,visibility=), allow threads: False {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 690.880060] env[59620]: DEBUG nova.virt.hardware [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Flavor limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 690.880227] env[59620]: DEBUG nova.virt.hardware [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Image limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 690.882316] env[59620]: DEBUG nova.virt.hardware [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Flavor pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 690.882316] env[59620]: DEBUG nova.virt.hardware [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Image pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 690.882441] env[59620]: DEBUG nova.virt.hardware [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 690.884615] env[59620]: DEBUG nova.virt.hardware [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 690.884615] env[59620]: DEBUG nova.virt.hardware [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 690.884615] env[59620]: DEBUG nova.virt.hardware [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Got 1 possible topologies {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 690.884615] env[59620]: DEBUG nova.virt.hardware [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 690.884615] env[59620]: DEBUG nova.virt.hardware [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 690.884801] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f07adfce-dd6a-4fcc-8323-763c20afb2df {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 690.893786] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48d16afe-9585-424e-ab8f-828812dd5f41 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 690.973759] env[59620]: DEBUG nova.policy [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6316c8b7da8d4d3c97b2693b33729c52', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cc4c8738af2b48f981e5f2feadb41a59', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59620) authorize /opt/stack/nova/nova/policy.py:203}}
[ 691.405025] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Acquiring lock "8aec9e05-7685-4895-b375-6f5cd45e7a5f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 691.405025] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "8aec9e05-7685-4895-b375-6f5cd45e7a5f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 691.418795] env[59620]: DEBUG nova.compute.manager [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f] Starting instance... {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 691.442157] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Acquiring lock "9807a449-4cca-416c-815d-99d5bc674464" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 691.442360] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "9807a449-4cca-416c-815d-99d5bc674464" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 691.466657] env[59620]: DEBUG nova.compute.manager [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9807a449-4cca-416c-815d-99d5bc674464] Starting instance... {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 691.497166] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 691.497166] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 691.498661] env[59620]: INFO nova.compute.claims [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 691.549414] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 691.657130] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d15cfba3-47e8-4e00-bbe5-5cf317d8ba59 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 691.666432] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-96593f31-73e5-4eba-9603-4f9a9f3dc8de {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 691.702282] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99f43caf-d73f-478c-a29d-7f50125aad46 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 691.710254] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-398c018f-da98-4ff7-a3bf-a0684644b395 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 691.725216] env[59620]: DEBUG nova.compute.provider_tree [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 691.740490] env[59620]: DEBUG nova.scheduler.client.report [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 691.761656] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.265s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 691.762182] env[59620]: DEBUG nova.compute.manager [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f] Start building networks asynchronously for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 691.764845] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.215s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 691.766233] env[59620]: INFO nova.compute.claims [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9807a449-4cca-416c-815d-99d5bc674464] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 691.812117] env[59620]: DEBUG nova.compute.utils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Using /dev/sd instead of None {{(pid=59620) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 691.813715] env[59620]: DEBUG nova.compute.manager [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f] Allocating IP information in the background. {{(pid=59620) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 691.815892] env[59620]: DEBUG nova.network.neutron [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f] allocate_for_instance() {{(pid=59620) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 691.824165] env[59620]: DEBUG nova.compute.manager [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f] Start building block device mappings for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 691.932655] env[59620]: DEBUG nova.compute.manager [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f] Start spawning the instance on the hypervisor. {{(pid=59620) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 691.940475] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-875f686f-42e4-4483-bf58-c82012e4a38d {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 691.949638] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79018139-c70b-4076-b21b-44bd2294c67f {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 691.985419] env[59620]: DEBUG nova.virt.hardware [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T20:10:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T20:10:40Z,direct_url=,disk_format='vmdk',id=2efa4364-ba59-4de9-978f-169a769ee710,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f77c77bd878b4e17895863a4a7a8191f',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T20:10:41Z,virtual_size=,visibility=), allow threads: False {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 691.985631] env[59620]: DEBUG nova.virt.hardware [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Flavor limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 691.985795] env[59620]: DEBUG nova.virt.hardware [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Image limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 691.985973] env[59620]: DEBUG nova.virt.hardware [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Flavor pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 691.986125] env[59620]: DEBUG nova.virt.hardware [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Image pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 691.986267] env[59620]: DEBUG nova.virt.hardware [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 691.986465] env[59620]: DEBUG nova.virt.hardware [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 691.986620] env[59620]: DEBUG nova.virt.hardware [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59620) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 691.986781] env[59620]: DEBUG nova.virt.hardware [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Got 1 possible topologies {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 691.986935] env[59620]: DEBUG nova.virt.hardware [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 691.987110] env[59620]: DEBUG nova.virt.hardware [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 691.988599] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b5a3959-3816-47ef-beff-db0cfd3e082b {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 691.991714] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1458c177-cd7f-4c8b-b924-0b50bbc5a181 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 692.000799] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-30e4fb67-31cf-4ae6-a36f-23802f1c6a45 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 692.005706] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-19a0cd7a-2bdd-48bc-ab13-afd560595be1 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 692.029944] env[59620]: DEBUG nova.compute.provider_tree [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 692.048888] env[59620]: DEBUG nova.scheduler.client.report [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 692.071733] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.307s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 692.072212] env[59620]: DEBUG nova.compute.manager [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9807a449-4cca-416c-815d-99d5bc674464] Start building networks asynchronously for 
instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 692.116252] env[59620]: DEBUG nova.compute.utils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Using /dev/sd instead of None {{(pid=59620) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 692.116252] env[59620]: DEBUG nova.compute.manager [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9807a449-4cca-416c-815d-99d5bc674464] Allocating IP information in the background. {{(pid=59620) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 692.116495] env[59620]: DEBUG nova.network.neutron [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9807a449-4cca-416c-815d-99d5bc674464] allocate_for_instance() {{(pid=59620) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 692.129657] env[59620]: DEBUG nova.compute.manager [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9807a449-4cca-416c-815d-99d5bc674464] Start building block device mappings for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 692.236563] env[59620]: DEBUG nova.compute.manager [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9807a449-4cca-416c-815d-99d5bc674464] Start spawning the instance on the hypervisor. 
{{(pid=59620) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 692.268433] env[59620]: DEBUG nova.virt.hardware [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T20:10:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T20:10:40Z,direct_url=,disk_format='vmdk',id=2efa4364-ba59-4de9-978f-169a769ee710,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f77c77bd878b4e17895863a4a7a8191f',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T20:10:41Z,virtual_size=,visibility=), allow threads: False {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 692.269154] env[59620]: DEBUG nova.virt.hardware [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Flavor limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 692.269154] env[59620]: DEBUG nova.virt.hardware [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Image limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 692.269154] env[59620]: DEBUG nova.virt.hardware [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Flavor pref 0:0:0 
{{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 692.272630] env[59620]: DEBUG nova.virt.hardware [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Image pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 692.272798] env[59620]: DEBUG nova.virt.hardware [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 692.273027] env[59620]: DEBUG nova.virt.hardware [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 692.273183] env[59620]: DEBUG nova.virt.hardware [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 692.273342] env[59620]: DEBUG nova.virt.hardware [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Got 1 possible topologies {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 692.273493] env[59620]: DEBUG nova.virt.hardware [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 
tempest-MultipleCreateTestJSON-1361786026-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 692.273653] env[59620]: DEBUG nova.virt.hardware [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 692.274837] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a59c234-8ca8-4281-b4aa-7fee894b526b {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 692.285486] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4baa221f-16ca-474a-b1ac-da48acf5a766 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 692.303646] env[59620]: DEBUG nova.policy [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bfbe2267ded84c71b3af181cb852d581', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1de6a55b95aa4af2865ec70142a20326', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59620) authorize /opt/stack/nova/nova/policy.py:203}} [ 692.309314] env[59620]: DEBUG nova.policy [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 
tempest-MultipleCreateTestJSON-1361786026-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bfbe2267ded84c71b3af181cb852d581', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1de6a55b95aa4af2865ec70142a20326', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59620) authorize /opt/stack/nova/nova/policy.py:203}} [ 692.397949] env[59620]: DEBUG nova.network.neutron [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] Successfully created port: 4ccfce69-0ece-46b9-9cc5-a7451971a93d {{(pid=59620) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 692.498435] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Acquiring lock "3157e7e4-fe8e-42b6-891b-ae0333b25f33" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 692.498711] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Lock "3157e7e4-fe8e-42b6-891b-ae0333b25f33" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 692.512165] env[59620]: DEBUG nova.compute.manager [None 
req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] Starting instance... {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 692.574718] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 692.574718] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 692.575969] env[59620]: INFO nova.compute.claims [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 692.725802] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-741261b4-9958-4804-adc4-d0fb2052eea2 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 692.733141] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-947d3075-8391-4683-9b4a-ee5691c4e78f {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 
692.765580] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce25cb46-92c6-4adf-bfdb-c2e88e29726f {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 692.772844] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d2194948-b80e-4e6a-8820-f6da073d367b {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 692.785796] env[59620]: DEBUG nova.compute.provider_tree [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 692.795387] env[59620]: DEBUG nova.scheduler.client.report [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 692.810109] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.236s 
{{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 692.810483] env[59620]: DEBUG nova.compute.manager [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] Start building networks asynchronously for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 692.856264] env[59620]: DEBUG nova.compute.utils [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Using /dev/sd instead of None {{(pid=59620) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 692.856947] env[59620]: DEBUG nova.compute.manager [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] Allocating IP information in the background. {{(pid=59620) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 692.857085] env[59620]: DEBUG nova.network.neutron [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] allocate_for_instance() {{(pid=59620) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 692.866115] env[59620]: DEBUG nova.compute.manager [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] Start building block device mappings for instance. 
{{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 692.908814] env[59620]: INFO nova.virt.block_device [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] Booting with volume 070f46bc-c5c6-4781-b1c7-8b7202fb2254 at /dev/sda [ 692.941571] env[59620]: DEBUG nova.network.neutron [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9807a449-4cca-416c-815d-99d5bc674464] Successfully created port: 36365480-2ece-4649-a6c2-b796389d5a15 {{(pid=59620) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 692.955572] env[59620]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-a1123bce-4523-419e-a158-fa719755bf11 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 692.968065] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ea0303e-0c68-4264-97f8-f27f2b5523a4 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 692.994015] env[59620]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-58d57a55-6cdb-43c2-b09f-2267d28d4b47 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 693.000584] env[59620]: DEBUG nova.policy [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fea9ad8b418d4797b0374a3b8da2cd95', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 
'6feb8151289a4819ada69e3e87e3e27e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59620) authorize /opt/stack/nova/nova/policy.py:203}} [ 693.005550] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a7490d3-4773-4509-ba67-3686eb8cf82f {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 693.030709] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c92496d-4abb-4e98-b85f-f8e6ce4b7a00 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 693.037672] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0de4e7e9-1e2a-4bd0-bdd2-c24e6831f164 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 693.054899] env[59620]: DEBUG nova.virt.block_device [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] Updating existing volume attachment record: d5344833-600a-46fe-8eaf-0b2c1cb157a1 {{(pid=59620) _volume_attach /opt/stack/nova/nova/virt/block_device.py:631}} [ 693.351129] env[59620]: DEBUG nova.compute.manager [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] Start spawning the instance on the hypervisor. 
{{(pid=59620) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 693.351129] env[59620]: DEBUG nova.virt.hardware [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T20:10:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=,container_format=,created_at=,direct_url=,disk_format=,id=,min_disk=0,min_ram=0,name=,owner=,properties=ImageMetaProps,protected=,size=1073741824,status='active',tags=,updated_at=,virtual_size=,visibility=), allow threads: False {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 693.351129] env[59620]: DEBUG nova.virt.hardware [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Flavor limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 693.351283] env[59620]: DEBUG nova.virt.hardware [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Image limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 693.351756] env[59620]: DEBUG nova.virt.hardware [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Flavor pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 693.352103] env[59620]: DEBUG nova.virt.hardware [None 
req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Image pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 693.352353] env[59620]: DEBUG nova.virt.hardware [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 693.352648] env[59620]: DEBUG nova.virt.hardware [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 693.352896] env[59620]: DEBUG nova.virt.hardware [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 693.353159] env[59620]: DEBUG nova.virt.hardware [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Got 1 possible topologies {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 693.354972] env[59620]: DEBUG nova.virt.hardware [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) 
_get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 693.354972] env[59620]: DEBUG nova.virt.hardware [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 693.354972] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b7e0a5c-37b5-41dd-a12f-abcaee6b0871 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 693.362228] env[59620]: DEBUG oslo_concurrency.lockutils [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Acquiring lock "1015f7da-bc69-489b-bb38-b31c1fe919a8" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 693.362561] env[59620]: DEBUG oslo_concurrency.lockutils [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Lock "1015f7da-bc69-489b-bb38-b31c1fe919a8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 693.367817] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c4085fa-4e3e-4ce8-88be-aebc65f83fae {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 693.383106] env[59620]: DEBUG nova.compute.manager [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 
tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] Starting instance... {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 693.438020] env[59620]: DEBUG oslo_concurrency.lockutils [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 693.438020] env[59620]: DEBUG oslo_concurrency.lockutils [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 693.438020] env[59620]: INFO nova.compute.claims [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 693.602111] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b017e3f-7372-4b10-903a-88c344800cc0 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 693.612019] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de02609b-4342-458e-93e3-1189df63dbd5 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 693.643494] env[59620]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-234a1968-8120-4828-b65c-a2083f7229eb {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 693.651594] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4fc635bf-8e10-428f-ba81-20b46ae87d84 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 693.665395] env[59620]: DEBUG nova.compute.provider_tree [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 693.674387] env[59620]: DEBUG nova.scheduler.client.report [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 693.691759] env[59620]: DEBUG oslo_concurrency.lockutils [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.255s {{(pid=59620) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 693.692355] env[59620]: DEBUG nova.compute.manager [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] Start building networks asynchronously for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 693.732457] env[59620]: DEBUG nova.compute.utils [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Using /dev/sd instead of None {{(pid=59620) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 693.733795] env[59620]: DEBUG nova.compute.manager [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] Allocating IP information in the background. {{(pid=59620) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 693.733887] env[59620]: DEBUG nova.network.neutron [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] allocate_for_instance() {{(pid=59620) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 693.746469] env[59620]: DEBUG nova.compute.manager [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] Start building block device mappings for instance. 
{{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 693.837022] env[59620]: DEBUG nova.compute.manager [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] Start spawning the instance on the hypervisor. {{(pid=59620) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 693.870370] env[59620]: DEBUG nova.virt.hardware [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T20:10:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T20:10:40Z,direct_url=,disk_format='vmdk',id=2efa4364-ba59-4de9-978f-169a769ee710,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f77c77bd878b4e17895863a4a7a8191f',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T20:10:41Z,virtual_size=,visibility=), allow threads: False {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 693.870441] env[59620]: DEBUG nova.virt.hardware [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Flavor limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 693.870923] env[59620]: DEBUG nova.virt.hardware [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 
tempest-ServerActionsTestJSON-841667631-project-member] Image limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 693.870923] env[59620]: DEBUG nova.virt.hardware [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Flavor pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 693.870923] env[59620]: DEBUG nova.virt.hardware [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Image pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 693.872251] env[59620]: DEBUG nova.virt.hardware [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 693.872577] env[59620]: DEBUG nova.virt.hardware [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 693.872794] env[59620]: DEBUG nova.virt.hardware [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 693.873331] env[59620]: DEBUG nova.virt.hardware [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 
tempest-ServerActionsTestJSON-841667631-project-member] Got 1 possible topologies {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 693.873586] env[59620]: DEBUG nova.virt.hardware [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 693.873880] env[59620]: DEBUG nova.virt.hardware [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 693.874991] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44512d23-eb55-4afa-909c-cbeda43aedb4 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 693.885178] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2bcfe4f8-c492-47e6-b01d-79e233607f7f {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 693.929787] env[59620]: DEBUG nova.policy [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '63112604b7d94a4aa441652bfcb9aed6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ed29a28397c34c4491790f867effce4e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 
'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59620) authorize /opt/stack/nova/nova/policy.py:203}} [ 693.971926] env[59620]: DEBUG nova.network.neutron [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f] Successfully created port: 8fceeed4-872d-49b7-ad0b-e79723805903 {{(pid=59620) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 694.767351] env[59620]: DEBUG oslo_concurrency.lockutils [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Acquiring lock "d65656c5-2cdb-4152-8e47-20d182d39c7a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 694.767782] env[59620]: DEBUG oslo_concurrency.lockutils [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Lock "d65656c5-2cdb-4152-8e47-20d182d39c7a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 694.779472] env[59620]: DEBUG nova.compute.manager [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] Starting instance... 
{{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 694.832274] env[59620]: DEBUG oslo_concurrency.lockutils [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 694.832521] env[59620]: DEBUG oslo_concurrency.lockutils [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 694.834315] env[59620]: INFO nova.compute.claims [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 695.024998] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c16ae29-5f14-4999-84bc-a8f6d3f39de0 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 695.035067] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c5f20d6-1225-4cf2-a669-9169fe26f9f9 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 695.039444] env[59620]: DEBUG nova.network.neutron [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] [instance: 
3157e7e4-fe8e-42b6-891b-ae0333b25f33] Successfully created port: c6a56c1d-139f-43f1-9a39-a337feab33b5 {{(pid=59620) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 695.068377] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c8da2fb7-3d22-42bf-9f90-b891beca79f7 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 695.075761] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb234261-ea45-45ad-a752-290114ac9ff6 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 695.088890] env[59620]: DEBUG nova.compute.provider_tree [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 695.097465] env[59620]: DEBUG nova.scheduler.client.report [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 695.113153] env[59620]: DEBUG oslo_concurrency.lockutils [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 
tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.280s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 695.113362] env[59620]: DEBUG nova.compute.manager [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] Start building networks asynchronously for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 695.155859] env[59620]: DEBUG nova.compute.utils [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Using /dev/sd instead of None {{(pid=59620) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 695.157240] env[59620]: DEBUG nova.compute.manager [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] Allocating IP information in the background. 
{{(pid=59620) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 695.157409] env[59620]: DEBUG nova.network.neutron [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] allocate_for_instance() {{(pid=59620) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 695.166474] env[59620]: DEBUG nova.compute.manager [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] Start building block device mappings for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 695.238230] env[59620]: DEBUG nova.compute.manager [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] Start spawning the instance on the hypervisor. 
{{(pid=59620) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 695.268544] env[59620]: DEBUG nova.virt.hardware [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T20:10:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T20:10:40Z,direct_url=,disk_format='vmdk',id=2efa4364-ba59-4de9-978f-169a769ee710,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f77c77bd878b4e17895863a4a7a8191f',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T20:10:41Z,virtual_size=,visibility=), allow threads: False {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 695.268842] env[59620]: DEBUG nova.virt.hardware [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Flavor limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 695.268920] env[59620]: DEBUG nova.virt.hardware [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Image limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 695.269086] env[59620]: DEBUG nova.virt.hardware [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 
tempest-ListServerFiltersTestJSON-1569189728-project-member] Flavor pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 695.269229] env[59620]: DEBUG nova.virt.hardware [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Image pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 695.269372] env[59620]: DEBUG nova.virt.hardware [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 695.269590] env[59620]: DEBUG nova.virt.hardware [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 695.269758] env[59620]: DEBUG nova.virt.hardware [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 695.269921] env[59620]: DEBUG nova.virt.hardware [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Got 1 possible topologies {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 695.270090] env[59620]: DEBUG nova.virt.hardware [None 
req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 695.270248] env[59620]: DEBUG nova.virt.hardware [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 695.271143] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-203db55a-ebf4-402d-80d3-92eaaa88ba8f {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 695.279723] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-182dfe27-e74c-4d57-a320-5f5be9ac55bd {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 695.440937] env[59620]: DEBUG nova.policy [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'db368a78ac5245d6a869b37de0fc1d2e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '61af4231bb9c4fc2a2d742b7c3d1db40', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59620) authorize /opt/stack/nova/nova/policy.py:203}} [ 695.442970] env[59620]: DEBUG nova.network.neutron [None 
req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] Successfully created port: 1f720111-e6c1-4177-a050-ae028aa2b25a {{(pid=59620) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 695.618202] env[59620]: DEBUG oslo_concurrency.lockutils [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Acquiring lock "06b62938-99d6-43a1-af87-aced894bc8d8" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 695.618446] env[59620]: DEBUG oslo_concurrency.lockutils [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Lock "06b62938-99d6-43a1-af87-aced894bc8d8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 695.629160] env[59620]: DEBUG nova.compute.manager [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] Starting instance... 
{{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 695.691457] env[59620]: DEBUG oslo_concurrency.lockutils [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 695.691701] env[59620]: DEBUG oslo_concurrency.lockutils [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 695.694874] env[59620]: INFO nova.compute.claims [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 695.903125] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d8e582e6-b08a-47da-9ddd-3dec0618cdb6 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 695.913225] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b50d2766-bca7-4998-97ba-36fdd70c0f2f {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 695.953272] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-90813985-7889-497b-85e4-dd9c451a8018 {{(pid=59620) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 695.961301] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b083680-06c6-47e0-b679-0f190009aac8 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 695.975199] env[59620]: DEBUG nova.compute.provider_tree [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 695.984726] env[59620]: DEBUG nova.scheduler.client.report [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 696.002058] env[59620]: DEBUG oslo_concurrency.lockutils [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.310s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 696.002252] env[59620]: DEBUG nova.compute.manager [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] Start building networks asynchronously for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 696.045015] env[59620]: DEBUG nova.compute.utils [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Using /dev/sd instead of None {{(pid=59620) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 696.046610] env[59620]: DEBUG nova.compute.manager [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] Allocating IP information in the background. {{(pid=59620) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 696.046779] env[59620]: DEBUG nova.network.neutron [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] allocate_for_instance() {{(pid=59620) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 696.059916] env[59620]: DEBUG nova.compute.manager [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] Start building block device mappings for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 696.124028] env[59620]: INFO nova.virt.block_device [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] Booting with volume fc3e203c-1f3f-4ffd-8eb7-15cdd302b995 at /dev/sda
[ 696.140336] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Acquiring lock "630acd3e-e4e3-483b-984c-7023fd8c77d5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 696.140738] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Lock "630acd3e-e4e3-483b-984c-7023fd8c77d5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 696.150143] env[59620]: DEBUG nova.compute.manager [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] Starting instance... {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 696.171787] env[59620]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-a04e015c-82e6-4d1b-8114-6678744bd33d {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 696.179426] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d00b881-95cd-4302-bf44-b5b0274071b1 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 696.206941] env[59620]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-fe7d889c-cb8c-4e7a-84c8-79dccd673fae {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 696.212832] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 696.212832] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 696.214505] env[59620]: INFO nova.compute.claims [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 696.221060] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ccdaa74-fbc0-4b15-818b-a49f314405cb {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 696.248437] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-afba90bd-4f82-4de0-b9ce-b46310c90667 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 696.260049] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4001c3f2-5f21-4074-876a-2efaefff36f0 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 696.272234] env[59620]: DEBUG nova.virt.block_device [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] Updating existing volume attachment record: 1fe774a7-ab8c-4f22-8f2d-8f943554a512 {{(pid=59620) _volume_attach /opt/stack/nova/nova/virt/block_device.py:631}}
[ 696.276066] env[59620]: ERROR nova.compute.manager [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port ad221b8a-6737-4342-b2e8-d01b33365352, please check neutron logs for more information.
[ 696.276066] env[59620]: ERROR nova.compute.manager Traceback (most recent call last):
[ 696.276066] env[59620]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 696.276066] env[59620]: ERROR nova.compute.manager     nwinfo = self.network_api.allocate_for_instance(
[ 696.276066] env[59620]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 696.276066] env[59620]: ERROR nova.compute.manager     created_port_ids = self._update_ports_for_instance(
[ 696.276066] env[59620]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 696.276066] env[59620]: ERROR nova.compute.manager     with excutils.save_and_reraise_exception():
[ 696.276066] env[59620]: ERROR nova.compute.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 696.276066] env[59620]: ERROR nova.compute.manager     self.force_reraise()
[ 696.276066] env[59620]: ERROR nova.compute.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 696.276066] env[59620]: ERROR nova.compute.manager     raise self.value
[ 696.276066] env[59620]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 696.276066] env[59620]: ERROR nova.compute.manager     updated_port = self._update_port(
[ 696.276066] env[59620]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 696.276066] env[59620]: ERROR nova.compute.manager     _ensure_no_port_binding_failure(port)
[ 696.276625] env[59620]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 696.276625] env[59620]: ERROR nova.compute.manager     raise exception.PortBindingFailed(port_id=port['id'])
[ 696.276625] env[59620]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port ad221b8a-6737-4342-b2e8-d01b33365352, please check neutron logs for more information.
[ 696.276625] env[59620]: ERROR nova.compute.manager
[ 696.276625] env[59620]: Traceback (most recent call last):
[ 696.276625] env[59620]:   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait
[ 696.276625] env[59620]:     listener.cb(fileno)
[ 696.276625] env[59620]:   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 696.276625] env[59620]:     result = function(*args, **kwargs)
[ 696.276625] env[59620]:   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 696.276625] env[59620]:     return func(*args, **kwargs)
[ 696.276625] env[59620]:   File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 696.276625] env[59620]:     raise e
[ 696.276625] env[59620]:   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 696.276625] env[59620]:     nwinfo = self.network_api.allocate_for_instance(
[ 696.276625] env[59620]:   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 696.276625] env[59620]:     created_port_ids = self._update_ports_for_instance(
[ 696.276625] env[59620]:   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 696.276625] env[59620]:     with excutils.save_and_reraise_exception():
[ 696.276625] env[59620]:   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 696.276625] env[59620]:     self.force_reraise()
[ 696.276625] env[59620]:   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 696.276625] env[59620]:     raise self.value
[ 696.276625] env[59620]:   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 696.276625] env[59620]:     updated_port = self._update_port(
[ 696.276625] env[59620]:   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 696.276625] env[59620]:     _ensure_no_port_binding_failure(port)
[ 696.276625] env[59620]:   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 696.276625] env[59620]:     raise exception.PortBindingFailed(port_id=port['id'])
[ 696.277350] env[59620]: nova.exception.PortBindingFailed: Binding failed for port ad221b8a-6737-4342-b2e8-d01b33365352, please check neutron logs for more information.
[ 696.277350] env[59620]: Removing descriptor: 16
[ 696.279550] env[59620]: ERROR nova.compute.manager [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port ad221b8a-6737-4342-b2e8-d01b33365352, please check neutron logs for more information.
[ 696.279550] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] Traceback (most recent call last):
[ 696.279550] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35]   File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 696.279550] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35]     yield resources
[ 696.279550] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 696.279550] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35]     self.driver.spawn(context, instance, image_meta,
[ 696.279550] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 696.279550] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 696.279550] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 696.279550] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35]     vm_ref = self.build_virtual_machine(instance,
[ 696.279550] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 696.279948] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35]     vif_infos = vmwarevif.get_vif_info(self._session,
[ 696.279948] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35]   File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 696.279948] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35]     for vif in network_info:
[ 696.279948] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35]   File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 696.279948] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35]     return self._sync_wrapper(fn, *args, **kwargs)
[ 696.279948] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35]   File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 696.279948] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35]     self.wait()
[ 696.279948] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35]   File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 696.279948] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35]     self[:] = self._gt.wait()
[ 696.279948] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 696.279948] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35]     return self._exit_event.wait()
[ 696.279948] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35]   File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 696.279948] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35]     result = hub.switch()
[ 696.279948] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35]   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 696.280363] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35]     return self.greenlet.switch()
[ 696.280363] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 696.280363] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35]     result = function(*args, **kwargs)
[ 696.280363] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35]   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 696.280363] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35]     return func(*args, **kwargs)
[ 696.280363] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35]   File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 696.280363] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35]     raise e
[ 696.280363] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35]   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 696.280363] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35]     nwinfo = self.network_api.allocate_for_instance(
[ 696.280363] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35]   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 696.280363] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35]     created_port_ids = self._update_ports_for_instance(
[ 696.280363] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35]   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 696.280363] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35]     with excutils.save_and_reraise_exception():
[ 696.280717] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 696.280717] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35]     self.force_reraise()
[ 696.280717] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 696.280717] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35]     raise self.value
[ 696.280717] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35]   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 696.280717] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35]     updated_port = self._update_port(
[ 696.280717] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35]   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 696.280717] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35]     _ensure_no_port_binding_failure(port)
[ 696.280717] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35]   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 696.280717] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35]     raise exception.PortBindingFailed(port_id=port['id'])
[ 696.280717] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] nova.exception.PortBindingFailed: Binding failed for port ad221b8a-6737-4342-b2e8-d01b33365352, please check neutron logs for more information.
[ 696.280717] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35]
[ 696.281049] env[59620]: INFO nova.compute.manager [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] Terminating instance
[ 696.282822] env[59620]: DEBUG oslo_concurrency.lockutils [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Acquiring lock "refresh_cache-763e888f-2290-4ca2-ab8e-703450a21e35" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 696.282973] env[59620]: DEBUG oslo_concurrency.lockutils [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Acquired lock "refresh_cache-763e888f-2290-4ca2-ab8e-703450a21e35" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 696.283172] env[59620]: DEBUG nova.network.neutron [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 696.366325] env[59620]: DEBUG nova.network.neutron [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 696.391039] env[59620]: DEBUG nova.policy [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1c7cb7edb3ad470ab8eb7b47954960f9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9d1b4970d4cd420e8fa3c599f48248e3', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59620) authorize /opt/stack/nova/nova/policy.py:203}}
[ 696.484381] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-70a692e9-ddf2-431f-9f7d-9b3919e0b09f {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 696.494832] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c5367cb3-61aa-412b-9ab3-d7bbc6051eaf {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 696.534924] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b0965d3-88ad-4b88-a2b6-849e761e17d2 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 696.539700] env[59620]: DEBUG nova.compute.manager [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] Start spawning the instance on the hypervisor. {{(pid=59620) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 696.539700] env[59620]: DEBUG nova.virt.hardware [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T20:10:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 696.539927] env[59620]: DEBUG nova.virt.hardware [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Flavor limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 696.540093] env[59620]: DEBUG nova.virt.hardware [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Image limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 696.540265] env[59620]: DEBUG nova.virt.hardware [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Flavor pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 696.540404] env[59620]: DEBUG nova.virt.hardware [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Image pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 696.540541] env[59620]: DEBUG nova.virt.hardware [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 696.540784] env[59620]: DEBUG nova.virt.hardware [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 696.540975] env[59620]: DEBUG nova.virt.hardware [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 696.541199] env[59620]: DEBUG nova.virt.hardware [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Got 1 possible topologies {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 696.541365] env[59620]: DEBUG nova.virt.hardware [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 696.541528] env[59620]: DEBUG nova.virt.hardware [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 696.543028] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff45febe-9ecd-4064-940e-3b89cf537fb2 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 696.550563] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c095f1c-8cbf-4446-a3e9-c7f1bb22c3af {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 696.560289] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-70e5b8f8-4829-41a3-88b6-46ba9120f736 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 696.573122] env[59620]: DEBUG nova.compute.provider_tree [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 696.586656] env[59620]: DEBUG nova.scheduler.client.report [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 696.604536] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.391s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 696.604536] env[59620]: DEBUG nova.compute.manager [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] Start building networks asynchronously for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 696.643289] env[59620]: DEBUG nova.compute.utils [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Using /dev/sd instead of None {{(pid=59620) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 696.645127] env[59620]: DEBUG nova.compute.manager [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] Allocating IP information in the background. {{(pid=59620) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 696.645307] env[59620]: DEBUG nova.network.neutron [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] allocate_for_instance() {{(pid=59620) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 696.658442] env[59620]: DEBUG nova.compute.manager [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] Start building block device mappings for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 696.732442] env[59620]: DEBUG nova.compute.manager [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] Start spawning the instance on the hypervisor. {{(pid=59620) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 696.755845] env[59620]: DEBUG nova.virt.hardware [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T20:10:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T20:10:40Z,direct_url=<?>,disk_format='vmdk',id=2efa4364-ba59-4de9-978f-169a769ee710,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f77c77bd878b4e17895863a4a7a8191f',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2024-10-16T20:10:41Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 696.756103] env[59620]: DEBUG nova.virt.hardware [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Flavor limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 696.756256] env[59620]: DEBUG nova.virt.hardware [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Image limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 696.756431] env[59620]: DEBUG nova.virt.hardware [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Flavor pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 696.756570] env[59620]: DEBUG nova.virt.hardware [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Image pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 696.756708] env[59620]: DEBUG nova.virt.hardware [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 696.756913] env[59620]: DEBUG nova.virt.hardware [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 696.757077] env[59620]: DEBUG nova.virt.hardware [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 696.757242] env[59620]: DEBUG nova.virt.hardware [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Got 1 possible topologies {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 696.757403] env[59620]: DEBUG nova.virt.hardware [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member]
Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 696.757613] env[59620]: DEBUG nova.virt.hardware [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 696.758479] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ffac756-cf63-420f-a30b-06bdfa01b74f {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 696.766416] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3619c65f-ad11-4e3f-9ada-794b54390a2a {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 696.937420] env[59620]: DEBUG nova.network.neutron [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 696.948431] env[59620]: DEBUG oslo_concurrency.lockutils [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Releasing lock "refresh_cache-763e888f-2290-4ca2-ab8e-703450a21e35" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 696.948845] env[59620]: DEBUG nova.compute.manager [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 
tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] Start destroying the instance on the hypervisor. {{(pid=59620) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 696.949055] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] Destroying instance {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 696.949578] env[59620]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-b0628c44-604d-4194-b297-65351c6c9e9f {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 696.958585] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-87dcf8af-ccf0-4b2f-9881-54986143d03b {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 696.981200] env[59620]: WARNING nova.virt.vmwareapi.vmops [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 763e888f-2290-4ca2-ab8e-703450a21e35 could not be found. 
[ 696.981433] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] Instance destroyed {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 696.981612] env[59620]: INFO nova.compute.manager [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] Took 0.03 seconds to destroy the instance on the hypervisor. [ 696.981853] env[59620]: DEBUG oslo.service.loopingcall [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=59620) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 696.982093] env[59620]: DEBUG nova.compute.manager [-] [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 696.982187] env[59620]: DEBUG nova.network.neutron [-] [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 697.036971] env[59620]: DEBUG nova.policy [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8af6208ef2254718bec125cc1bc85039', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '29c14b15b48749a9bf540ab72ee49451', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59620) authorize /opt/stack/nova/nova/policy.py:203}} [ 697.238498] env[59620]: DEBUG nova.network.neutron [-] [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 697.256206] env[59620]: DEBUG nova.network.neutron [-] [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 697.278587] env[59620]: INFO nova.compute.manager [-] [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] Took 0.30 seconds to deallocate network for instance. 
[ 697.281834] env[59620]: DEBUG nova.compute.claims [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] Aborting claim: {{(pid=59620) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 697.282157] env[59620]: DEBUG oslo_concurrency.lockutils [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 697.282502] env[59620]: DEBUG oslo_concurrency.lockutils [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 697.465064] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d297af8-df5f-4552-ad98-f146321a2ac2 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.473331] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05c60f4c-0788-4f29-9675-31e6242fc1f3 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.504154] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba9241f4-275b-4c5d-ba2f-53c65a56299d {{(pid=59620) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.511869] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-552228a3-2876-4342-9fc8-ec7228ad0188 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.525682] env[59620]: DEBUG nova.compute.provider_tree [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 697.535476] env[59620]: DEBUG nova.scheduler.client.report [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 697.549713] env[59620]: DEBUG oslo_concurrency.lockutils [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.267s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 697.550369] env[59620]: ERROR nova.compute.manager 
[None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port ad221b8a-6737-4342-b2e8-d01b33365352, please check neutron logs for more information. [ 697.550369] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] Traceback (most recent call last): [ 697.550369] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 697.550369] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] self.driver.spawn(context, instance, image_meta, [ 697.550369] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 697.550369] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] self._vmops.spawn(context, instance, image_meta, injected_files, [ 697.550369] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 697.550369] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] vm_ref = self.build_virtual_machine(instance, [ 697.550369] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 697.550369] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] vif_infos = vmwarevif.get_vif_info(self._session, [ 697.550369] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] File 
"/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 697.550656] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] for vif in network_info: [ 697.550656] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 697.550656] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] return self._sync_wrapper(fn, *args, **kwargs) [ 697.550656] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 697.550656] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] self.wait() [ 697.550656] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 697.550656] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] self[:] = self._gt.wait() [ 697.550656] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 697.550656] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] return self._exit_event.wait() [ 697.550656] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 697.550656] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] result = hub.switch() [ 697.550656] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 697.550656] env[59620]: ERROR nova.compute.manager [instance: 
763e888f-2290-4ca2-ab8e-703450a21e35] return self.greenlet.switch() [ 697.550656] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 697.551019] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] result = function(*args, **kwargs) [ 697.551019] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 697.551019] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] return func(*args, **kwargs) [ 697.551019] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 697.551019] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] raise e [ 697.551019] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 697.551019] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] nwinfo = self.network_api.allocate_for_instance( [ 697.551019] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 697.551019] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] created_port_ids = self._update_ports_for_instance( [ 697.551019] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 697.551019] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] with 
excutils.save_and_reraise_exception(): [ 697.551019] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 697.551019] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] self.force_reraise() [ 697.551308] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 697.551308] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] raise self.value [ 697.551308] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 697.551308] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] updated_port = self._update_port( [ 697.551308] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 697.551308] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] _ensure_no_port_binding_failure(port) [ 697.551308] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 697.551308] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] raise exception.PortBindingFailed(port_id=port['id']) [ 697.551308] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] nova.exception.PortBindingFailed: Binding failed for port ad221b8a-6737-4342-b2e8-d01b33365352, please check neutron logs for more information. 
[ 697.551308] env[59620]: ERROR nova.compute.manager [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] [ 697.551567] env[59620]: DEBUG nova.compute.utils [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] Binding failed for port ad221b8a-6737-4342-b2e8-d01b33365352, please check neutron logs for more information. {{(pid=59620) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 697.555744] env[59620]: DEBUG nova.compute.manager [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] Build of instance 763e888f-2290-4ca2-ab8e-703450a21e35 was re-scheduled: Binding failed for port ad221b8a-6737-4342-b2e8-d01b33365352, please check neutron logs for more information. 
{{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 697.556201] env[59620]: DEBUG nova.compute.manager [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] Unplugging VIFs for instance {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 697.556415] env[59620]: DEBUG oslo_concurrency.lockutils [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Acquiring lock "refresh_cache-763e888f-2290-4ca2-ab8e-703450a21e35" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 697.556549] env[59620]: DEBUG oslo_concurrency.lockutils [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Acquired lock "refresh_cache-763e888f-2290-4ca2-ab8e-703450a21e35" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 697.556699] env[59620]: DEBUG nova.network.neutron [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 697.653700] env[59620]: DEBUG nova.network.neutron [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 697.693164] env[59620]: DEBUG nova.network.neutron [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] Successfully created port: 2b350739-2c35-49e6-891a-61da85a49d31 {{(pid=59620) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 698.399494] env[59620]: DEBUG oslo_concurrency.lockutils [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Acquiring lock "588eb672-6240-46cd-8e93-b38c9e2829bf" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 698.400382] env[59620]: DEBUG oslo_concurrency.lockutils [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Lock "588eb672-6240-46cd-8e93-b38c9e2829bf" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 698.413030] env[59620]: DEBUG nova.compute.manager [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] Starting instance... 
{{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 698.459119] env[59620]: DEBUG nova.network.neutron [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 698.478626] env[59620]: DEBUG oslo_concurrency.lockutils [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Releasing lock "refresh_cache-763e888f-2290-4ca2-ab8e-703450a21e35" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 698.479283] env[59620]: DEBUG nova.compute.manager [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 698.480227] env[59620]: DEBUG nova.compute.manager [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 698.480227] env[59620]: DEBUG nova.network.neutron [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 698.483071] env[59620]: DEBUG oslo_concurrency.lockutils [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 698.483302] env[59620]: DEBUG oslo_concurrency.lockutils [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 698.484818] env[59620]: INFO nova.compute.claims [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] Claim successful on node 
domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 698.584632] env[59620]: DEBUG nova.network.neutron [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 698.592392] env[59620]: DEBUG nova.network.neutron [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 698.608470] env[59620]: INFO nova.compute.manager [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] [instance: 763e888f-2290-4ca2-ab8e-703450a21e35] Took 0.13 seconds to deallocate network for instance. 
[ 698.610023] env[59620]: DEBUG nova.network.neutron [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] Successfully created port: efe32174-7153-4005-a87b-4cc066a6d6b7 {{(pid=59620) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 698.703645] env[59620]: INFO nova.scheduler.client.report [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Deleted allocations for instance 763e888f-2290-4ca2-ab8e-703450a21e35
[ 698.725611] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d24d0a70-53a5-4121-af7e-84fbcb309161 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 698.729536] env[59620]: DEBUG oslo_concurrency.lockutils [None req-4aac40cc-3b82-4890-a31e-2806c25eff4d tempest-InstanceActionsNegativeTestJSON-1192799410 tempest-InstanceActionsNegativeTestJSON-1192799410-project-member] Lock "763e888f-2290-4ca2-ab8e-703450a21e35" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 10.532s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 698.736017] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-36f7297a-f1f3-4a63-9263-989b2f8d19e9 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 698.768320] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5068ebc1-9f6b-4196-905e-1b543ee414ec {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 698.775613] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b93bdfd0-a946-40ed-9635-6412c5092da6 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 698.790461] env[59620]: DEBUG nova.compute.provider_tree [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 698.800661] env[59620]: DEBUG nova.scheduler.client.report [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 698.814357] env[59620]: DEBUG oslo_concurrency.lockutils [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.331s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 698.814924] env[59620]: DEBUG nova.compute.manager [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] Start building networks asynchronously for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 698.865315] env[59620]: DEBUG nova.compute.utils [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Using /dev/sd instead of None {{(pid=59620) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 698.867420] env[59620]: DEBUG nova.compute.manager [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] Allocating IP information in the background. {{(pid=59620) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 698.867760] env[59620]: DEBUG nova.network.neutron [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] allocate_for_instance() {{(pid=59620) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 698.877298] env[59620]: DEBUG nova.compute.manager [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] Start building block device mappings for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 698.956978] env[59620]: DEBUG nova.compute.manager [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] Start spawning the instance on the hypervisor. {{(pid=59620) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 698.990183] env[59620]: DEBUG nova.virt.hardware [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T20:10:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T20:10:40Z,direct_url=,disk_format='vmdk',id=2efa4364-ba59-4de9-978f-169a769ee710,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f77c77bd878b4e17895863a4a7a8191f',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T20:10:41Z,virtual_size=,visibility=), allow threads: False {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 698.990183] env[59620]: DEBUG nova.virt.hardware [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Flavor limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 698.990303] env[59620]: DEBUG nova.virt.hardware [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Image limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 698.990874] env[59620]: DEBUG nova.virt.hardware [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Flavor pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 698.990874] env[59620]: DEBUG nova.virt.hardware [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Image pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 698.991320] env[59620]: DEBUG nova.virt.hardware [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 698.991649] env[59620]: DEBUG nova.virt.hardware [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 698.991826] env[59620]: DEBUG nova.virt.hardware [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 698.993640] env[59620]: DEBUG nova.virt.hardware [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Got 1 possible topologies {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 698.993640] env[59620]: DEBUG nova.virt.hardware [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 698.993640] env[59620]: DEBUG nova.virt.hardware [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 698.993860] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d670d89a-8dca-464d-a795-ca60b1fb549d {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 699.002312] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db0ee4ef-83a5-46df-8161-6c1e29545c18 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 699.048879] env[59620]: ERROR nova.compute.manager [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 4ccfce69-0ece-46b9-9cc5-a7451971a93d, please check neutron logs for more information.
[ 699.048879] env[59620]: ERROR nova.compute.manager Traceback (most recent call last):
[ 699.048879] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 699.048879] env[59620]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance(
[ 699.048879] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 699.048879] env[59620]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance(
[ 699.048879] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 699.048879] env[59620]: ERROR nova.compute.manager with excutils.save_and_reraise_exception():
[ 699.048879] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 699.048879] env[59620]: ERROR nova.compute.manager self.force_reraise()
[ 699.048879] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 699.048879] env[59620]: ERROR nova.compute.manager raise self.value
[ 699.048879] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 699.048879] env[59620]: ERROR nova.compute.manager updated_port = self._update_port(
[ 699.048879] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 699.048879] env[59620]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port)
[ 699.049319] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 699.049319] env[59620]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id'])
[ 699.049319] env[59620]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 4ccfce69-0ece-46b9-9cc5-a7451971a93d, please check neutron logs for more information.
[ 699.049319] env[59620]: ERROR nova.compute.manager
[ 699.049319] env[59620]: Traceback (most recent call last):
[ 699.049319] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait
[ 699.049319] env[59620]: listener.cb(fileno)
[ 699.049319] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 699.049319] env[59620]: result = function(*args, **kwargs)
[ 699.049319] env[59620]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 699.049319] env[59620]: return func(*args, **kwargs)
[ 699.049319] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 699.049319] env[59620]: raise e
[ 699.049319] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 699.049319] env[59620]: nwinfo = self.network_api.allocate_for_instance(
[ 699.049319] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 699.049319] env[59620]: created_port_ids = self._update_ports_for_instance(
[ 699.049319] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 699.049319] env[59620]: with excutils.save_and_reraise_exception():
[ 699.049319] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 699.049319] env[59620]: self.force_reraise()
[ 699.049319] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 699.049319] env[59620]: raise self.value
[ 699.049319] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 699.049319] env[59620]: updated_port = self._update_port(
[ 699.049319] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 699.049319] env[59620]: _ensure_no_port_binding_failure(port)
[ 699.049319] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 699.049319] env[59620]: raise exception.PortBindingFailed(port_id=port['id'])
[ 699.050015] env[59620]: nova.exception.PortBindingFailed: Binding failed for port 4ccfce69-0ece-46b9-9cc5-a7451971a93d, please check neutron logs for more information.
[ 699.050015] env[59620]: Removing descriptor: 15
[ 699.050015] env[59620]: ERROR nova.compute.manager [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 4ccfce69-0ece-46b9-9cc5-a7451971a93d, please check neutron logs for more information.
[ 699.050015] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] Traceback (most recent call last):
[ 699.050015] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 699.050015] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] yield resources
[ 699.050015] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 699.050015] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] self.driver.spawn(context, instance, image_meta,
[ 699.050015] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 699.050015] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 699.050015] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 699.050015] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] vm_ref = self.build_virtual_machine(instance,
[ 699.050309] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 699.050309] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] vif_infos = vmwarevif.get_vif_info(self._session,
[ 699.050309] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 699.050309] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] for vif in network_info:
[ 699.050309] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 699.050309] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] return self._sync_wrapper(fn, *args, **kwargs)
[ 699.050309] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 699.050309] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] self.wait()
[ 699.050309] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 699.050309] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] self[:] = self._gt.wait()
[ 699.050309] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 699.050309] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] return self._exit_event.wait()
[ 699.050309] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 699.050309] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] result = hub.switch()
[ 699.050710] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 699.050710] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] return self.greenlet.switch()
[ 699.050710] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 699.050710] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] result = function(*args, **kwargs)
[ 699.050710] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 699.050710] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] return func(*args, **kwargs)
[ 699.050710] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 699.050710] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] raise e
[ 699.050710] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 699.050710] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] nwinfo = self.network_api.allocate_for_instance(
[ 699.050710] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 699.050710] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] created_port_ids = self._update_ports_for_instance(
[ 699.050710] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 699.051146] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] with excutils.save_and_reraise_exception():
[ 699.051146] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 699.051146] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] self.force_reraise()
[ 699.051146] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 699.051146] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] raise self.value
[ 699.051146] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 699.051146] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] updated_port = self._update_port(
[ 699.051146] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 699.051146] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] _ensure_no_port_binding_failure(port)
[ 699.051146] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 699.051146] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] raise exception.PortBindingFailed(port_id=port['id'])
[ 699.051146] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] nova.exception.PortBindingFailed: Binding failed for port 4ccfce69-0ece-46b9-9cc5-a7451971a93d, please check neutron logs for more information.
[ 699.051146] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0]
[ 699.051497] env[59620]: INFO nova.compute.manager [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] Terminating instance
[ 699.052546] env[59620]: DEBUG oslo_concurrency.lockutils [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Acquiring lock "refresh_cache-b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 699.052629] env[59620]: DEBUG oslo_concurrency.lockutils [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Acquired lock "refresh_cache-b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 699.053478] env[59620]: DEBUG nova.network.neutron [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 699.061531] env[59620]: DEBUG nova.network.neutron [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] Successfully created port: 0eb03d48-4be8-4f39-82d5-41e84fcb3939 {{(pid=59620) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 699.134429] env[59620]: DEBUG nova.network.neutron [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 699.166227] env[59620]: DEBUG nova.policy [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7b24e88854ce4efb81412f81ab12f923', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '035fb02e7d5e4870a9853822e21bff7b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59620) authorize /opt/stack/nova/nova/policy.py:203}}
[ 699.189920] env[59620]: DEBUG oslo_concurrency.lockutils [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Acquiring lock "3f53c35f-40ea-4094-89e2-624b156e5560" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 699.190115] env[59620]: DEBUG oslo_concurrency.lockutils [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Lock "3f53c35f-40ea-4094-89e2-624b156e5560" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 699.200265] env[59620]: DEBUG nova.compute.manager [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] Starting instance... {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 699.257078] env[59620]: DEBUG oslo_concurrency.lockutils [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 699.257324] env[59620]: DEBUG oslo_concurrency.lockutils [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 699.258798] env[59620]: INFO nova.compute.claims [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 699.441026] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71da1ec0-61be-4482-a907-5fdf4b0c784e {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 699.450357] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-833bd5cd-9170-459f-8ef6-7e483ce9f8de {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 699.492558] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-101cfe3d-f089-465a-92fb-44611baa7a41 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 699.501758] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7ef6fc5-c29f-4ba8-b952-45ff9c65c520 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 699.516695] env[59620]: DEBUG nova.compute.provider_tree [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 699.524687] env[59620]: DEBUG nova.scheduler.client.report [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 699.540197] env[59620]: DEBUG oslo_concurrency.lockutils [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.282s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 699.540197] env[59620]: DEBUG nova.compute.manager [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] Start building networks asynchronously for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 699.576786] env[59620]: DEBUG nova.compute.utils [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Using /dev/sd instead of None {{(pid=59620) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 699.581255] env[59620]: DEBUG nova.compute.manager [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] Allocating IP information in the background. {{(pid=59620) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 699.581391] env[59620]: DEBUG nova.network.neutron [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] allocate_for_instance() {{(pid=59620) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 699.588205] env[59620]: DEBUG nova.compute.manager [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] Start building block device mappings for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 699.663142] env[59620]: DEBUG nova.compute.manager [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] Start spawning the instance on the hypervisor. {{(pid=59620) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 699.686781] env[59620]: DEBUG nova.virt.hardware [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T20:10:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T20:10:40Z,direct_url=,disk_format='vmdk',id=2efa4364-ba59-4de9-978f-169a769ee710,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f77c77bd878b4e17895863a4a7a8191f',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T20:10:41Z,virtual_size=,visibility=), allow threads: False {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 699.687047] env[59620]: DEBUG nova.virt.hardware [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Flavor limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 699.687455] env[59620]: DEBUG nova.virt.hardware [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Image limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 699.687652] env[59620]: DEBUG nova.virt.hardware [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Flavor pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 699.687808] env[59620]: DEBUG nova.virt.hardware [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Image pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 699.687965] env[59620]: DEBUG nova.virt.hardware [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 699.688140] env[59620]: DEBUG nova.virt.hardware [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 699.688296] env[59620]: DEBUG nova.virt.hardware [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 699.688458] env[59620]: DEBUG nova.virt.hardware [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Got 1 possible topologies {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 699.688989] env[59620]: DEBUG nova.virt.hardware [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 699.688989] env[59620]: DEBUG nova.virt.hardware [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 699.693061] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c418c084-7e19-467c-9e44-52d8ca86dcc8 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 699.699056] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be39fb14-45f3-4bbc-9b46-48555623072e {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 699.888527] env[59620]: DEBUG nova.policy [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694
tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '47a15abf5130436b9bb35961355b2452', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '734b5ead550e4b53b20700d2f870a662', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59620) authorize /opt/stack/nova/nova/policy.py:203}} [ 699.899972] env[59620]: DEBUG nova.network.neutron [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 699.917947] env[59620]: DEBUG oslo_concurrency.lockutils [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Releasing lock "refresh_cache-b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 699.917947] env[59620]: DEBUG nova.compute.manager [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] Start destroying the instance on the hypervisor. 
{{(pid=59620) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 699.917947] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] Destroying instance {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 699.917947] env[59620]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-ead0be33-4455-48af-8fa8-85bc50375598 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 699.930257] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e07fd24-eb9f-4b0c-b37e-d59ee09831f8 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 699.959629] env[59620]: WARNING nova.virt.vmwareapi.vmops [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0 could not be found. [ 699.959985] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] Instance destroyed {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 699.960218] env[59620]: INFO nova.compute.manager [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 699.960561] env[59620]: DEBUG oslo.service.loopingcall [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59620) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 699.960887] env[59620]: DEBUG nova.compute.manager [-] [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 699.961048] env[59620]: DEBUG nova.network.neutron [-] [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 700.033481] env[59620]: DEBUG oslo_concurrency.lockutils [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Acquiring lock "e89d07fc-9c98-4352-b609-c7fde7ee0d39" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 700.033705] env[59620]: DEBUG oslo_concurrency.lockutils [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Lock "e89d07fc-9c98-4352-b609-c7fde7ee0d39" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 700.036209] env[59620]: DEBUG nova.network.neutron [-] [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 700.056737] env[59620]: DEBUG nova.network.neutron [-] [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 700.071080] env[59620]: INFO nova.compute.manager [-] [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] Took 0.11 seconds to deallocate network for instance.
[ 700.073508] env[59620]: DEBUG nova.compute.claims [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] Aborting claim: {{(pid=59620) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 700.073711] env[59620]: DEBUG oslo_concurrency.lockutils [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 700.073931] env[59620]: DEBUG oslo_concurrency.lockutils [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 700.309823] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b179ed97-c9ef-45ce-a54f-f5e6d660f00d {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 700.318561] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e23aa09-f849-474b-987a-81126a3a20dd {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 700.354468] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bcfbc5af-359a-4fb2-ac31-7a740c7f890f {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 700.362187] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-80e8714b-bc2e-4334-b6e2-42b1bc47e6df {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 700.377930] env[59620]: DEBUG nova.compute.provider_tree [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 700.386853] env[59620]: DEBUG nova.scheduler.client.report [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 700.405751] env[59620]: DEBUG oslo_concurrency.lockutils [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.332s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 700.406321] env[59620]: ERROR nova.compute.manager [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 4ccfce69-0ece-46b9-9cc5-a7451971a93d, please check neutron logs for more information.
[ 700.406321] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] Traceback (most recent call last):
[ 700.406321] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 700.406321] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0]     self.driver.spawn(context, instance, image_meta,
[ 700.406321] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 700.406321] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 700.406321] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 700.406321] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0]     vm_ref = self.build_virtual_machine(instance,
[ 700.406321] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 700.406321] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0]     vif_infos = vmwarevif.get_vif_info(self._session,
[ 700.406321] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0]   File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 700.406677] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0]     for vif in network_info:
[ 700.406677] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0]   File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 700.406677] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0]     return self._sync_wrapper(fn, *args, **kwargs)
[ 700.406677] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0]   File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 700.406677] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0]     self.wait()
[ 700.406677] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0]   File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 700.406677] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0]     self[:] = self._gt.wait()
[ 700.406677] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 700.406677] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0]     return self._exit_event.wait()
[ 700.406677] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0]   File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 700.406677] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0]     result = hub.switch()
[ 700.406677] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0]   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 700.406677] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0]     return self.greenlet.switch()
[ 700.406677] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 700.408309] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0]     result = function(*args, **kwargs)
[ 700.408309] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0]   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 700.408309] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0]     return func(*args, **kwargs)
[ 700.408309] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0]   File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 700.408309] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0]     raise e
[ 700.408309] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0]   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 700.408309] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0]     nwinfo = self.network_api.allocate_for_instance(
[ 700.408309] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0]   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 700.408309] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0]     created_port_ids = self._update_ports_for_instance(
[ 700.408309] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0]   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 700.408309] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0]     with excutils.save_and_reraise_exception():
[ 700.408309] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 700.408309] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0]     self.force_reraise()
[ 700.409027] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 700.409027] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0]     raise self.value
[ 700.409027] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0]   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 700.409027] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0]     updated_port = self._update_port(
[ 700.409027] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0]   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 700.409027] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0]     _ensure_no_port_binding_failure(port)
[ 700.409027] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0]   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 700.409027] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0]     raise exception.PortBindingFailed(port_id=port['id'])
[ 700.409027] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] nova.exception.PortBindingFailed: Binding failed for port 4ccfce69-0ece-46b9-9cc5-a7451971a93d, please check neutron logs for more information.
[ 700.409027] env[59620]: ERROR nova.compute.manager [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0]
[ 700.409027] env[59620]: DEBUG nova.compute.utils [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] Binding failed for port 4ccfce69-0ece-46b9-9cc5-a7451971a93d, please check neutron logs for more information. {{(pid=59620) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 700.409383] env[59620]: DEBUG nova.compute.manager [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] Build of instance b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0 was re-scheduled: Binding failed for port 4ccfce69-0ece-46b9-9cc5-a7451971a93d, please check neutron logs for more information. {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 700.409383] env[59620]: DEBUG nova.compute.manager [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] Unplugging VIFs for instance {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 700.409383] env[59620]: DEBUG oslo_concurrency.lockutils [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Acquiring lock "refresh_cache-b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 700.409383] env[59620]: DEBUG oslo_concurrency.lockutils [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Acquired lock "refresh_cache-b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 700.409517] env[59620]: DEBUG nova.network.neutron [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 700.496036] env[59620]: DEBUG nova.network.neutron [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 701.049620] env[59620]: ERROR nova.compute.manager [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 36365480-2ece-4649-a6c2-b796389d5a15, please check neutron logs for more information.
[ 701.049620] env[59620]: ERROR nova.compute.manager Traceback (most recent call last):
[ 701.049620] env[59620]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 701.049620] env[59620]: ERROR nova.compute.manager     nwinfo = self.network_api.allocate_for_instance(
[ 701.049620] env[59620]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 701.049620] env[59620]: ERROR nova.compute.manager     created_port_ids = self._update_ports_for_instance(
[ 701.049620] env[59620]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 701.049620] env[59620]: ERROR nova.compute.manager     with excutils.save_and_reraise_exception():
[ 701.049620] env[59620]: ERROR nova.compute.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 701.049620] env[59620]: ERROR nova.compute.manager     self.force_reraise()
[ 701.049620] env[59620]: ERROR nova.compute.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 701.049620] env[59620]: ERROR nova.compute.manager     raise self.value
[ 701.049620] env[59620]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 701.049620] env[59620]: ERROR nova.compute.manager     updated_port = self._update_port(
[ 701.049620] env[59620]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 701.049620] env[59620]: ERROR nova.compute.manager     _ensure_no_port_binding_failure(port)
[ 701.050194] env[59620]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 701.050194] env[59620]: ERROR nova.compute.manager     raise exception.PortBindingFailed(port_id=port['id'])
[ 701.050194] env[59620]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 36365480-2ece-4649-a6c2-b796389d5a15, please check neutron logs for more information.
[ 701.050194] env[59620]: ERROR nova.compute.manager
[ 701.050194] env[59620]: Traceback (most recent call last):
[ 701.050194] env[59620]:   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait
[ 701.050194] env[59620]:     listener.cb(fileno)
[ 701.050194] env[59620]:   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 701.050194] env[59620]:     result = function(*args, **kwargs)
[ 701.050194] env[59620]:   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 701.050194] env[59620]:     return func(*args, **kwargs)
[ 701.050194] env[59620]:   File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 701.050194] env[59620]:     raise e
[ 701.050194] env[59620]:   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 701.050194] env[59620]:     nwinfo = self.network_api.allocate_for_instance(
[ 701.050194] env[59620]:   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 701.050194] env[59620]:     created_port_ids = self._update_ports_for_instance(
[ 701.050194] env[59620]:   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 701.050194] env[59620]:     with excutils.save_and_reraise_exception():
[ 701.050194] env[59620]:   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 701.050194] env[59620]:     self.force_reraise()
[ 701.050194] env[59620]:   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 701.050194] env[59620]:     raise self.value
[ 701.050194] env[59620]:   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 701.050194] env[59620]:     updated_port = self._update_port(
[ 701.050194] env[59620]:   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 701.050194] env[59620]:     _ensure_no_port_binding_failure(port)
[ 701.050194] env[59620]:   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 701.050194] env[59620]:     raise exception.PortBindingFailed(port_id=port['id'])
[ 701.051046] env[59620]: nova.exception.PortBindingFailed: Binding failed for port 36365480-2ece-4649-a6c2-b796389d5a15, please check neutron logs for more information.
[ 701.051046] env[59620]: Removing descriptor: 11
[ 701.051046] env[59620]: ERROR nova.compute.manager [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9807a449-4cca-416c-815d-99d5bc674464] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 36365480-2ece-4649-a6c2-b796389d5a15, please check neutron logs for more information.
[ 701.051046] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464] Traceback (most recent call last):
[ 701.051046] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464]   File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 701.051046] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464]     yield resources
[ 701.051046] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 701.051046] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464]     self.driver.spawn(context, instance, image_meta,
[ 701.051046] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 701.051046] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 701.051046] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 701.051046] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464]     vm_ref = self.build_virtual_machine(instance,
[ 701.051436] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 701.051436] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464]     vif_infos = vmwarevif.get_vif_info(self._session,
[ 701.051436] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464]   File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 701.051436] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464]     for vif in network_info:
[ 701.051436] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464]   File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 701.051436] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464]     return self._sync_wrapper(fn, *args, **kwargs)
[ 701.051436] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464]   File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 701.051436] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464]     self.wait()
[ 701.051436] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464]   File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 701.051436] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464]     self[:] = self._gt.wait()
[ 701.051436] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 701.051436] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464]     return self._exit_event.wait()
[ 701.051436] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464]   File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 701.051436] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464]     result = hub.switch()
[ 701.051768] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464]   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 701.051768] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464]     return self.greenlet.switch()
[ 701.051768] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 701.051768] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464]     result = function(*args, **kwargs)
[ 701.051768] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464]   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 701.051768] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464]     return func(*args, **kwargs)
[ 701.051768] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464]   File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 701.051768] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464]     raise e
[ 701.051768] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464]   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 701.051768] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464]     nwinfo = self.network_api.allocate_for_instance(
[ 701.051768] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464]   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 701.051768] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464]     created_port_ids = self._update_ports_for_instance(
[ 701.051768] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464]   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 701.052083] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464]     with excutils.save_and_reraise_exception():
[ 701.052083] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 701.052083] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464]     self.force_reraise()
[ 701.052083] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 701.052083] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464]     raise self.value
[ 701.052083] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464]   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 701.052083] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464]     updated_port = self._update_port(
[ 701.052083] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464]   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 701.052083] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464]     _ensure_no_port_binding_failure(port)
[ 701.052083] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464]   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 701.052083] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464]     raise exception.PortBindingFailed(port_id=port['id'])
[ 701.052083] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464] nova.exception.PortBindingFailed: Binding failed for port 36365480-2ece-4649-a6c2-b796389d5a15, please check neutron logs for more information.
[ 701.052083] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464] [ 701.052391] env[59620]: INFO nova.compute.manager [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9807a449-4cca-416c-815d-99d5bc674464] Terminating instance [ 701.053502] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Acquiring lock "refresh_cache-9807a449-4cca-416c-815d-99d5bc674464" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 701.053675] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Acquired lock "refresh_cache-9807a449-4cca-416c-815d-99d5bc674464" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 701.053839] env[59620]: DEBUG nova.network.neutron [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9807a449-4cca-416c-815d-99d5bc674464] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 701.137117] env[59620]: DEBUG nova.network.neutron [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9807a449-4cca-416c-815d-99d5bc674464] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 701.195945] env[59620]: ERROR nova.compute.manager [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 8fceeed4-872d-49b7-ad0b-e79723805903, please check neutron logs for more information.
[ 701.195945] env[59620]: ERROR nova.compute.manager Traceback (most recent call last):
[ 701.195945] env[59620]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 701.195945] env[59620]: ERROR nova.compute.manager     nwinfo = self.network_api.allocate_for_instance(
[ 701.195945] env[59620]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 701.195945] env[59620]: ERROR nova.compute.manager     created_port_ids = self._update_ports_for_instance(
[ 701.195945] env[59620]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 701.195945] env[59620]: ERROR nova.compute.manager     with excutils.save_and_reraise_exception():
[ 701.195945] env[59620]: ERROR nova.compute.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 701.195945] env[59620]: ERROR nova.compute.manager     self.force_reraise()
[ 701.195945] env[59620]: ERROR nova.compute.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 701.195945] env[59620]: ERROR nova.compute.manager     raise self.value
[ 701.195945] env[59620]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 701.195945] env[59620]: ERROR nova.compute.manager     updated_port = self._update_port(
[ 701.195945] env[59620]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 701.195945] env[59620]: ERROR nova.compute.manager     _ensure_no_port_binding_failure(port)
[ 701.196482] env[59620]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 701.196482] env[59620]: ERROR nova.compute.manager     raise exception.PortBindingFailed(port_id=port['id'])
[ 701.196482] env[59620]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 8fceeed4-872d-49b7-ad0b-e79723805903, please check neutron logs for more information.
[ 701.196482] env[59620]: ERROR nova.compute.manager
[ 701.196482] env[59620]: Traceback (most recent call last):
[ 701.196482] env[59620]:   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait
[ 701.196482] env[59620]:     listener.cb(fileno)
[ 701.196482] env[59620]:   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 701.196482] env[59620]:     result = function(*args, **kwargs)
[ 701.196482] env[59620]:   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 701.196482] env[59620]:     return func(*args, **kwargs)
[ 701.196482] env[59620]:   File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 701.196482] env[59620]:     raise e
[ 701.196482] env[59620]:   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 701.196482] env[59620]:     nwinfo = self.network_api.allocate_for_instance(
[ 701.196482] env[59620]:   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 701.196482] env[59620]:     created_port_ids = self._update_ports_for_instance(
[ 701.196482] env[59620]:   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 701.196482] env[59620]:     with excutils.save_and_reraise_exception():
[ 701.196482] env[59620]:   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 701.196482] env[59620]:     self.force_reraise()
[ 701.196482] env[59620]:   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 701.196482] env[59620]:     raise self.value
[ 701.196482] env[59620]:   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 701.196482] env[59620]:     updated_port = self._update_port(
[ 701.196482] env[59620]:   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 701.196482] env[59620]:     _ensure_no_port_binding_failure(port)
[ 701.196482] env[59620]:   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 701.196482] env[59620]:     raise exception.PortBindingFailed(port_id=port['id'])
[ 701.197383] env[59620]: nova.exception.PortBindingFailed: Binding failed for port 8fceeed4-872d-49b7-ad0b-e79723805903, please check neutron logs for more information.
[ 701.197383] env[59620]: Removing descriptor: 19
[ 701.197383] env[59620]: ERROR nova.compute.manager [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 8fceeed4-872d-49b7-ad0b-e79723805903, please check neutron logs for more information.
[ 701.197383] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f] Traceback (most recent call last):
[ 701.197383] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]   File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 701.197383] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]     yield resources
[ 701.197383] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 701.197383] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]     self.driver.spawn(context, instance, image_meta,
[ 701.197383] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 701.197383] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 701.197383] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 701.197383] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]     vm_ref = self.build_virtual_machine(instance,
[ 701.197712] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 701.197712] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]     vif_infos = vmwarevif.get_vif_info(self._session,
[ 701.197712] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]   File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 701.197712] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]     for vif in network_info:
[ 701.197712] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]   File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 701.197712] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]     return self._sync_wrapper(fn, *args, **kwargs)
[ 701.197712] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]   File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 701.197712] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]     self.wait()
[ 701.197712] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]   File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 701.197712] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]     self[:] = self._gt.wait()
[ 701.197712] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 701.197712] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]     return self._exit_event.wait()
[ 701.197712] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]   File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 701.197712] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]     result = hub.switch()
[ 701.198073] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 701.198073] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]     return self.greenlet.switch()
[ 701.198073] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 701.198073] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]     result = function(*args, **kwargs)
[ 701.198073] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 701.198073] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]     return func(*args, **kwargs)
[ 701.198073] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]   File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 701.198073] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]     raise e
[ 701.198073] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 701.198073] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]     nwinfo = self.network_api.allocate_for_instance(
[ 701.198073] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 701.198073] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]     created_port_ids = self._update_ports_for_instance(
[ 701.198073] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 701.198456] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]     with excutils.save_and_reraise_exception():
[ 701.198456] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 701.198456] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]     self.force_reraise()
[ 701.198456] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 701.198456] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]     raise self.value
[ 701.198456] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 701.198456] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]     updated_port = self._update_port(
[ 701.198456] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 701.198456] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]     _ensure_no_port_binding_failure(port)
[ 701.198456] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 701.198456] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]     raise exception.PortBindingFailed(port_id=port['id'])
[ 701.198456] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f] nova.exception.PortBindingFailed: Binding failed for port 8fceeed4-872d-49b7-ad0b-e79723805903, please check neutron logs for more information.
[ 701.198456] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]
[ 701.198833] env[59620]: INFO nova.compute.manager [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f] Terminating instance
[ 701.208219] env[59620]: DEBUG nova.network.neutron [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 701.211158] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Acquiring lock "refresh_cache-8aec9e05-7685-4895-b375-6f5cd45e7a5f" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 701.211376] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Acquired lock "refresh_cache-8aec9e05-7685-4895-b375-6f5cd45e7a5f" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 701.211542] env[59620]: DEBUG nova.network.neutron [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 701.228663] env[59620]: DEBUG oslo_concurrency.lockutils [None req-2da55564-4210-41d2-8253-83e4311b7f82
tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Releasing lock "refresh_cache-b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 701.229350] env[59620]: DEBUG nova.compute.manager [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 701.229603] env[59620]: DEBUG nova.compute.manager [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 701.229870] env[59620]: DEBUG nova.network.neutron [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 701.253793] env[59620]: DEBUG oslo_concurrency.lockutils [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Acquiring lock "e31d29d1-c49c-4696-85c9-11cb985a7bfd" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 701.253793] env[59620]: DEBUG oslo_concurrency.lockutils [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Lock "e31d29d1-c49c-4696-85c9-11cb985a7bfd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 701.306182] env[59620]: DEBUG nova.network.neutron [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 701.308453] env[59620]: DEBUG nova.network.neutron [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 701.315239] env[59620]: DEBUG nova.network.neutron [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 701.327207] env[59620]: INFO nova.compute.manager [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0] Took 0.10 seconds to deallocate network for instance.
[ 701.432934] env[59620]: DEBUG nova.network.neutron [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] Successfully created port: bc0d8fad-86d1-4cf5-9df4-6db7cc8f5e4d {{(pid=59620) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 701.435823] env[59620]: INFO nova.scheduler.client.report [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Deleted allocations for instance b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0
[ 701.461089] env[59620]: DEBUG oslo_concurrency.lockutils [None req-2da55564-4210-41d2-8253-83e4311b7f82 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "b310c1fd-3a8b-49d7-8990-a9e7cc6ba5d0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 11.091s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 701.478646] env[59620]: DEBUG nova.compute.manager [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] Starting instance... {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 701.530310] env[59620]: DEBUG oslo_concurrency.lockutils [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 701.530566] env[59620]: DEBUG oslo_concurrency.lockutils [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 701.532166] env[59620]: INFO nova.compute.claims [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 701.755676] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f09ada5a-c956-4232-8726-7044cf33cf30 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 701.765150] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f15efbfb-3ed5-4b18-8a9a-79a3a11c54f3 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 701.804899] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ee6ca567-95ec-4289-8755-af9e5d4d0ddd {{(pid=59620) request_handler
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 701.813426] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf0dadc9-ff2d-4231-9602-9e5e427a2b60 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 701.827358] env[59620]: DEBUG nova.compute.provider_tree [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 701.839826] env[59620]: DEBUG nova.scheduler.client.report [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 701.861459] env[59620]: DEBUG oslo_concurrency.lockutils [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.331s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 701.861951] env[59620]: DEBUG nova.compute.manager [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] Start building networks asynchronously for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 701.902788] env[59620]: DEBUG nova.compute.utils [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Using /dev/sd instead of None {{(pid=59620) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 701.904202] env[59620]: DEBUG nova.compute.manager [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] Allocating IP information in the background. {{(pid=59620) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 701.904358] env[59620]: DEBUG nova.network.neutron [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] allocate_for_instance() {{(pid=59620) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 701.914218] env[59620]: DEBUG nova.compute.manager [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] Start building block device mappings for instance.
{{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 701.957779] env[59620]: DEBUG nova.network.neutron [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9807a449-4cca-416c-815d-99d5bc674464] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 701.968468] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Releasing lock "refresh_cache-9807a449-4cca-416c-815d-99d5bc674464" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 701.968868] env[59620]: DEBUG nova.compute.manager [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9807a449-4cca-416c-815d-99d5bc674464] Start destroying the instance on the hypervisor. {{(pid=59620) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 701.969094] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9807a449-4cca-416c-815d-99d5bc674464] Destroying instance {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 701.969592] env[59620]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-1504ac72-2405-410b-a278-c477ef2b3934 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 701.980123] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f7eaf42-d7a4-41c5-8821-0a6f85f1d281 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 702.000215] env[59620]: DEBUG nova.compute.manager [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] Start spawning the instance on the hypervisor. {{(pid=59620) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 702.009954] env[59620]: WARNING nova.virt.vmwareapi.vmops [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9807a449-4cca-416c-815d-99d5bc674464] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 9807a449-4cca-416c-815d-99d5bc674464 could not be found.
[ 702.009954] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9807a449-4cca-416c-815d-99d5bc674464] Instance destroyed {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 702.010106] env[59620]: INFO nova.compute.manager [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9807a449-4cca-416c-815d-99d5bc674464] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 702.010337] env[59620]: DEBUG oslo.service.loopingcall [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59620) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 702.010538] env[59620]: DEBUG nova.compute.manager [-] [instance: 9807a449-4cca-416c-815d-99d5bc674464] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 702.010667] env[59620]: DEBUG nova.network.neutron [-] [instance: 9807a449-4cca-416c-815d-99d5bc674464] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 702.031056] env[59620]: DEBUG nova.virt.hardware [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Getting desirable topologies for flavor
Flavor(created_at=2024-10-16T20:10:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T20:10:40Z,direct_url=,disk_format='vmdk',id=2efa4364-ba59-4de9-978f-169a769ee710,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f77c77bd878b4e17895863a4a7a8191f',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T20:10:41Z,virtual_size=,visibility=), allow threads: False {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 702.031314] env[59620]: DEBUG nova.virt.hardware [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Flavor limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 702.031502] env[59620]: DEBUG nova.virt.hardware [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Image limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 702.031660] env[59620]: DEBUG nova.virt.hardware [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Flavor pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 702.032743] env[59620]: DEBUG nova.virt.hardware [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Image pref 0:0:0 {{(pid=59620) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 702.032743] env[59620]: DEBUG nova.virt.hardware [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 702.032743] env[59620]: DEBUG nova.virt.hardware [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 702.032743] env[59620]: DEBUG nova.virt.hardware [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 702.032743] env[59620]: DEBUG nova.virt.hardware [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Got 1 possible topologies {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 702.032979] env[59620]: DEBUG nova.virt.hardware [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 702.032979] env[59620]: DEBUG nova.virt.hardware [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a 
tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 702.033830] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-59a2ecf6-b5e8-4867-a379-724823bda9cf {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 702.041824] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c2d2a37-e5c3-4c7f-8020-82384633ee8b {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 702.086583] env[59620]: DEBUG nova.network.neutron [-] [instance: 9807a449-4cca-416c-815d-99d5bc674464] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 702.103747] env[59620]: DEBUG nova.network.neutron [-] [instance: 9807a449-4cca-416c-815d-99d5bc674464] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 702.107057] env[59620]: DEBUG nova.network.neutron [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 702.109475] env[59620]: DEBUG nova.network.neutron [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] Successfully created port: 
89928e6b-9db2-4a6a-8579-49fb7f3d9e4f {{(pid=59620) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 702.127787] env[59620]: INFO nova.compute.manager [-] [instance: 9807a449-4cca-416c-815d-99d5bc674464] Took 0.11 seconds to deallocate network for instance. [ 702.127787] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Releasing lock "refresh_cache-8aec9e05-7685-4895-b375-6f5cd45e7a5f" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 702.127787] env[59620]: DEBUG nova.compute.manager [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f] Start destroying the instance on the hypervisor. {{(pid=59620) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 702.127787] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f] Destroying instance {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 702.127787] env[59620]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-ea939583-644a-4371-8634-ddfd90c4e6a1 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 702.131598] env[59620]: DEBUG nova.compute.claims [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9807a449-4cca-416c-815d-99d5bc674464] Aborting claim: {{(pid=59620) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 702.131598] env[59620]: DEBUG 
oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 702.131598] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 702.138266] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea6b2f6b-1f14-492f-8a06-bed7256c5a26 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 702.161642] env[59620]: WARNING nova.virt.vmwareapi.vmops [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 8aec9e05-7685-4895-b375-6f5cd45e7a5f could not be found. 
[ 702.161898] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f] Instance destroyed {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 702.162107] env[59620]: INFO nova.compute.manager [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f] Took 0.04 seconds to destroy the instance on the hypervisor. [ 702.162349] env[59620]: DEBUG oslo.service.loopingcall [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59620) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 702.162565] env[59620]: DEBUG nova.compute.manager [-] [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 702.162656] env[59620]: DEBUG nova.network.neutron [-] [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 702.255666] env[59620]: DEBUG nova.network.neutron [-] [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 702.265324] env[59620]: DEBUG nova.network.neutron [-] [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 702.276216] env[59620]: INFO nova.compute.manager [-] [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f] Took 0.11 seconds to deallocate network for instance. [ 702.278156] env[59620]: DEBUG nova.compute.claims [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f] Aborting claim: {{(pid=59620) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 702.278324] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 702.349922] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-355e51ee-5979-4a2b-93bd-1045ad30fff5 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 702.357828] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7840340b-804a-47a2-b0ad-6576f6cd817f {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 702.388981] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7bf39b8-9160-4da2-a33b-57ce82154b9b {{(pid=59620) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 702.396887] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a7fe6ff-ddd5-4064-a5ca-d26c5f8a881e {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 702.410613] env[59620]: DEBUG nova.compute.provider_tree [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 702.426185] env[59620]: DEBUG nova.scheduler.client.report [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 702.433859] env[59620]: DEBUG nova.policy [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6f9c6ae878974ba680acf8fae4e0f078', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '296603b1512a48abb6f0d69cb85ac9da', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 
'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59620) authorize /opt/stack/nova/nova/policy.py:203}} [ 702.439977] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.310s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 702.440603] env[59620]: ERROR nova.compute.manager [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9807a449-4cca-416c-815d-99d5bc674464] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 36365480-2ece-4649-a6c2-b796389d5a15, please check neutron logs for more information. 
[ 702.440603] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464] Traceback (most recent call last): [ 702.440603] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 702.440603] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464] self.driver.spawn(context, instance, image_meta, [ 702.440603] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 702.440603] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464] self._vmops.spawn(context, instance, image_meta, injected_files, [ 702.440603] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 702.440603] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464] vm_ref = self.build_virtual_machine(instance, [ 702.440603] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 702.440603] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464] vif_infos = vmwarevif.get_vif_info(self._session, [ 702.440603] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 702.441126] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464] for vif in network_info: [ 702.441126] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 702.441126] env[59620]: ERROR 
nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464] return self._sync_wrapper(fn, *args, **kwargs) [ 702.441126] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 702.441126] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464] self.wait() [ 702.441126] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 702.441126] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464] self[:] = self._gt.wait() [ 702.441126] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 702.441126] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464] return self._exit_event.wait() [ 702.441126] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 702.441126] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464] result = hub.switch() [ 702.441126] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 702.441126] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464] return self.greenlet.switch() [ 702.441126] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 702.441694] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464] result = function(*args, **kwargs) [ 
702.441694] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 702.441694] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464] return func(*args, **kwargs) [ 702.441694] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 702.441694] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464] raise e [ 702.441694] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 702.441694] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464] nwinfo = self.network_api.allocate_for_instance( [ 702.441694] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 702.441694] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464] created_port_ids = self._update_ports_for_instance( [ 702.441694] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 702.441694] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464] with excutils.save_and_reraise_exception(): [ 702.441694] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 702.441694] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464] self.force_reraise() [ 702.442564] env[59620]: ERROR nova.compute.manager [instance: 
9807a449-4cca-416c-815d-99d5bc674464] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 702.442564] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464] raise self.value [ 702.442564] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 702.442564] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464] updated_port = self._update_port( [ 702.442564] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 702.442564] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464] _ensure_no_port_binding_failure(port) [ 702.442564] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 702.442564] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464] raise exception.PortBindingFailed(port_id=port['id']) [ 702.442564] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464] nova.exception.PortBindingFailed: Binding failed for port 36365480-2ece-4649-a6c2-b796389d5a15, please check neutron logs for more information. [ 702.442564] env[59620]: ERROR nova.compute.manager [instance: 9807a449-4cca-416c-815d-99d5bc674464] [ 702.442564] env[59620]: DEBUG nova.compute.utils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9807a449-4cca-416c-815d-99d5bc674464] Binding failed for port 36365480-2ece-4649-a6c2-b796389d5a15, please check neutron logs for more information. 
{{(pid=59620) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 702.443022] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.164s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 702.444977] env[59620]: DEBUG nova.compute.manager [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9807a449-4cca-416c-815d-99d5bc674464] Build of instance 9807a449-4cca-416c-815d-99d5bc674464 was re-scheduled: Binding failed for port 36365480-2ece-4649-a6c2-b796389d5a15, please check neutron logs for more information. {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 702.445426] env[59620]: DEBUG nova.compute.manager [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9807a449-4cca-416c-815d-99d5bc674464] Unplugging VIFs for instance {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 702.445639] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Acquiring lock "refresh_cache-9807a449-4cca-416c-815d-99d5bc674464" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 702.445778] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Acquired lock 
"refresh_cache-9807a449-4cca-416c-815d-99d5bc674464" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 702.445931] env[59620]: DEBUG nova.network.neutron [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9807a449-4cca-416c-815d-99d5bc674464] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 702.548356] env[59620]: DEBUG nova.network.neutron [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9807a449-4cca-416c-815d-99d5bc674464] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 702.655967] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3c8e788-a3ae-4a2e-a3ba-00243854a076 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 702.664114] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-30ac83e9-a39d-437d-ad76-5f72d0f7ffd2 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 702.694605] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a08cc382-bb4e-4cbe-ab3d-179b76773ecb {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 702.701980] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7359cec2-ac47-4f1b-8f31-bad895a13711 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 702.717074] env[59620]: DEBUG 
nova.compute.provider_tree [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 702.726594] env[59620]: DEBUG nova.scheduler.client.report [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 702.744637] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.302s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 702.745250] env[59620]: ERROR nova.compute.manager [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 8fceeed4-872d-49b7-ad0b-e79723805903, please check neutron logs for more information.
[ 702.745250] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f] Traceback (most recent call last):
[ 702.745250] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 702.745250] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]     self.driver.spawn(context, instance, image_meta,
[ 702.745250] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 702.745250] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 702.745250] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 702.745250] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]     vm_ref = self.build_virtual_machine(instance,
[ 702.745250] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 702.745250] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]     vif_infos = vmwarevif.get_vif_info(self._session,
[ 702.745250] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]   File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 702.745609] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]     for vif in network_info:
[ 702.745609] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]   File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 702.745609] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]     return self._sync_wrapper(fn, *args, **kwargs)
[ 702.745609] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]   File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 702.745609] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]     self.wait()
[ 702.745609] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]   File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 702.745609] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]     self[:] = self._gt.wait()
[ 702.745609] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 702.745609] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]     return self._exit_event.wait()
[ 702.745609] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]   File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 702.745609] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]     result = hub.switch()
[ 702.745609] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 702.745609] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]     return self.greenlet.switch()
[ 702.745609] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 702.745990] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]     result = function(*args, **kwargs)
[ 702.745990] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 702.745990] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]     return func(*args, **kwargs)
[ 702.745990] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]   File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 702.745990] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]     raise e
[ 702.745990] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 702.745990] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]     nwinfo = self.network_api.allocate_for_instance(
[ 702.745990] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 702.745990] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]     created_port_ids = self._update_ports_for_instance(
[ 702.745990] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 702.745990] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]     with excutils.save_and_reraise_exception():
[ 702.745990] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 702.745990] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]     self.force_reraise()
[ 702.746426] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 702.746426] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]     raise self.value
[ 702.746426] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 702.746426] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]     updated_port = self._update_port(
[ 702.746426] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 702.746426] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]     _ensure_no_port_binding_failure(port)
[ 702.746426] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 702.746426] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f]     raise exception.PortBindingFailed(port_id=port['id'])
[ 702.746426] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f] nova.exception.PortBindingFailed: Binding failed for port 8fceeed4-872d-49b7-ad0b-e79723805903, please check neutron logs for more information.
[ 702.746426] env[59620]: ERROR nova.compute.manager [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f] 
[ 702.746426] env[59620]: DEBUG nova.compute.utils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f] Binding failed for port 8fceeed4-872d-49b7-ad0b-e79723805903, please check neutron logs for more information. {{(pid=59620) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 702.747358] env[59620]: DEBUG nova.compute.manager [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f] Build of instance 8aec9e05-7685-4895-b375-6f5cd45e7a5f was re-scheduled: Binding failed for port 8fceeed4-872d-49b7-ad0b-e79723805903, please check neutron logs for more information. {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 702.747787] env[59620]: DEBUG nova.compute.manager [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f] Unplugging VIFs for instance {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 702.748008] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Acquiring lock "refresh_cache-8aec9e05-7685-4895-b375-6f5cd45e7a5f" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 702.748154] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Acquired lock "refresh_cache-8aec9e05-7685-4895-b375-6f5cd45e7a5f" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 702.748303] env[59620]: DEBUG nova.network.neutron [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 702.816990] env[59620]: DEBUG nova.network.neutron [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 703.081746] env[59620]: DEBUG oslo_concurrency.lockutils [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Acquiring lock "6fdadbc2-14e5-440f-aba2-4db693f56de6" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 703.081978] env[59620]: DEBUG oslo_concurrency.lockutils [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Lock "6fdadbc2-14e5-440f-aba2-4db693f56de6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 703.315407] env[59620]: DEBUG nova.network.neutron [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9807a449-4cca-416c-815d-99d5bc674464] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 703.326111] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Releasing lock "refresh_cache-9807a449-4cca-416c-815d-99d5bc674464" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 703.326352] env[59620]: DEBUG nova.compute.manager [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 703.326569] env[59620]: DEBUG nova.compute.manager [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9807a449-4cca-416c-815d-99d5bc674464] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 703.326751] env[59620]: DEBUG nova.network.neutron [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9807a449-4cca-416c-815d-99d5bc674464] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 703.371010] env[59620]: DEBUG nova.network.neutron [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 703.385366] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Releasing lock "refresh_cache-8aec9e05-7685-4895-b375-6f5cd45e7a5f" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 703.385592] env[59620]: DEBUG nova.compute.manager [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 703.385766] env[59620]: DEBUG nova.compute.manager [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 703.385921] env[59620]: DEBUG nova.network.neutron [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 703.408408] env[59620]: DEBUG nova.network.neutron [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9807a449-4cca-416c-815d-99d5bc674464] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 703.420984] env[59620]: DEBUG nova.network.neutron [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9807a449-4cca-416c-815d-99d5bc674464] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 703.425508] env[59620]: ERROR nova.compute.manager [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 1f720111-e6c1-4177-a050-ae028aa2b25a, please check neutron logs for more information.
[ 703.425508] env[59620]: ERROR nova.compute.manager Traceback (most recent call last):
[ 703.425508] env[59620]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 703.425508] env[59620]: ERROR nova.compute.manager     nwinfo = self.network_api.allocate_for_instance(
[ 703.425508] env[59620]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 703.425508] env[59620]: ERROR nova.compute.manager     created_port_ids = self._update_ports_for_instance(
[ 703.425508] env[59620]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 703.425508] env[59620]: ERROR nova.compute.manager     with excutils.save_and_reraise_exception():
[ 703.425508] env[59620]: ERROR nova.compute.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 703.425508] env[59620]: ERROR nova.compute.manager     self.force_reraise()
[ 703.425508] env[59620]: ERROR nova.compute.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 703.425508] env[59620]: ERROR nova.compute.manager     raise self.value
[ 703.425508] env[59620]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 703.425508] env[59620]: ERROR nova.compute.manager     updated_port = self._update_port(
[ 703.425508] env[59620]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 703.425508] env[59620]: ERROR nova.compute.manager     _ensure_no_port_binding_failure(port)
[ 703.426030] env[59620]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 703.426030] env[59620]: ERROR nova.compute.manager     raise exception.PortBindingFailed(port_id=port['id'])
[ 703.426030] env[59620]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 1f720111-e6c1-4177-a050-ae028aa2b25a, please check neutron logs for more information.
[ 703.426030] env[59620]: ERROR nova.compute.manager 
[ 703.426030] env[59620]: Traceback (most recent call last):
[ 703.426030] env[59620]:   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait
[ 703.426030] env[59620]:     listener.cb(fileno)
[ 703.426030] env[59620]:   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 703.426030] env[59620]:     result = function(*args, **kwargs)
[ 703.426030] env[59620]:   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 703.426030] env[59620]:     return func(*args, **kwargs)
[ 703.426030] env[59620]:   File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 703.426030] env[59620]:     raise e
[ 703.426030] env[59620]:   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 703.426030] env[59620]:     nwinfo = self.network_api.allocate_for_instance(
[ 703.426030] env[59620]:   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 703.426030] env[59620]:     created_port_ids = self._update_ports_for_instance(
[ 703.426030] env[59620]:   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 703.426030] env[59620]:     with excutils.save_and_reraise_exception():
[ 703.426030] env[59620]:   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 703.426030] env[59620]:     self.force_reraise()
[ 703.426030] env[59620]:   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 703.426030] env[59620]:     raise self.value
[ 703.426030] env[59620]:   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 703.426030] env[59620]:     updated_port = self._update_port(
[ 703.426030] env[59620]:   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 703.426030] env[59620]:     _ensure_no_port_binding_failure(port)
[ 703.426030] env[59620]:   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 703.426030] env[59620]:     raise exception.PortBindingFailed(port_id=port['id'])
[ 703.426704] env[59620]: nova.exception.PortBindingFailed: Binding failed for port 1f720111-e6c1-4177-a050-ae028aa2b25a, please check neutron logs for more information.
[ 703.426704] env[59620]: Removing descriptor: 14
[ 703.426704] env[59620]: ERROR nova.compute.manager [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 1f720111-e6c1-4177-a050-ae028aa2b25a, please check neutron logs for more information.
[ 703.426704] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] Traceback (most recent call last):
[ 703.426704] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8]   File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 703.426704] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8]     yield resources
[ 703.426704] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 703.426704] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8]     self.driver.spawn(context, instance, image_meta,
[ 703.426704] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 703.426704] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 703.426704] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 703.427036] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8]     vm_ref = self.build_virtual_machine(instance,
[ 703.427036] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 703.427036] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8]     vif_infos = vmwarevif.get_vif_info(self._session,
[ 703.427036] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8]   File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 703.427036] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8]     for vif in network_info:
[ 703.427036] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8]   File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 703.427036] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8]     return self._sync_wrapper(fn, *args, **kwargs)
[ 703.427036] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8]   File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 703.427036] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8]     self.wait()
[ 703.427036] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8]   File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 703.427036] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8]     self[:] = self._gt.wait()
[ 703.427036] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 703.427036] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8]     return self._exit_event.wait()
[ 703.427036] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8]   File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 703.427036] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8]     result = hub.switch()
[ 703.427362] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8]   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 703.427362] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8]     return self.greenlet.switch()
[ 703.427362] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 703.427362] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8]     result = function(*args, **kwargs)
[ 703.427362] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8]   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 703.427362] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8]     return func(*args, **kwargs)
[ 703.427362] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8]   File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 703.427362] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8]     raise e
[ 703.427362] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8]   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 703.427362] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8]     nwinfo = self.network_api.allocate_for_instance(
[ 703.427362] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8]   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 703.427362] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8]     created_port_ids = self._update_ports_for_instance(
[ 703.427362] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8]   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 703.427670] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8]     with excutils.save_and_reraise_exception():
[ 703.427670] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 703.427670] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8]     self.force_reraise()
[ 703.427670] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 703.427670] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8]     raise self.value
[ 703.427670] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8]   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 703.427670] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8]     updated_port = self._update_port(
[ 703.427670] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8]   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 703.427670] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8]     _ensure_no_port_binding_failure(port)
[ 703.427670] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8]   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 703.427670] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8]     raise exception.PortBindingFailed(port_id=port['id'])
[ 703.427670] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] nova.exception.PortBindingFailed: Binding failed for port 1f720111-e6c1-4177-a050-ae028aa2b25a, please check neutron logs for more information.
[ 703.427670] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] 
[ 703.427983] env[59620]: INFO nova.compute.manager [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] Terminating instance
[ 703.430979] env[59620]: DEBUG oslo_concurrency.lockutils [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Acquiring lock "refresh_cache-1015f7da-bc69-489b-bb38-b31c1fe919a8" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 703.430979] env[59620]: DEBUG oslo_concurrency.lockutils [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Acquired lock "refresh_cache-1015f7da-bc69-489b-bb38-b31c1fe919a8" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 703.430979] env[59620]: DEBUG nova.network.neutron [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 703.436377] env[59620]: INFO nova.compute.manager [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9807a449-4cca-416c-815d-99d5bc674464] Took 0.11 seconds to deallocate network for instance.
[ 703.466234] env[59620]: ERROR nova.compute.manager [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port c6a56c1d-139f-43f1-9a39-a337feab33b5, please check neutron logs for more information.
[ 703.466234] env[59620]: ERROR nova.compute.manager Traceback (most recent call last):
[ 703.466234] env[59620]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 703.466234] env[59620]: ERROR nova.compute.manager     nwinfo = self.network_api.allocate_for_instance(
[ 703.466234] env[59620]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 703.466234] env[59620]: ERROR nova.compute.manager     created_port_ids = self._update_ports_for_instance(
[ 703.466234] env[59620]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 703.466234] env[59620]: ERROR nova.compute.manager     with excutils.save_and_reraise_exception():
[ 703.466234] env[59620]: ERROR nova.compute.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 703.466234] env[59620]: ERROR nova.compute.manager     self.force_reraise()
[ 703.466234] env[59620]: ERROR nova.compute.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 703.466234] env[59620]: ERROR nova.compute.manager     raise self.value
[ 703.466234] env[59620]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 703.466234] env[59620]: ERROR nova.compute.manager     updated_port = self._update_port(
[ 703.466234] env[59620]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 703.466234] env[59620]: ERROR nova.compute.manager     _ensure_no_port_binding_failure(port)
[ 703.466738] env[59620]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 703.466738] env[59620]: ERROR nova.compute.manager     raise exception.PortBindingFailed(port_id=port['id'])
[ 703.466738] env[59620]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port c6a56c1d-139f-43f1-9a39-a337feab33b5, please check neutron logs for more information.
[ 703.466738] env[59620]: ERROR nova.compute.manager 
[ 703.466738] env[59620]: Traceback (most recent call last):
[ 703.466738] env[59620]:   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait
[ 703.466738] env[59620]:     listener.cb(fileno)
[ 703.466738] env[59620]:   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 703.466738] env[59620]:     result = function(*args, **kwargs)
[ 703.466738] env[59620]:   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 703.466738] env[59620]:     return func(*args, **kwargs)
[ 703.466738] env[59620]:   File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 703.466738] env[59620]:     raise e
[ 703.466738] env[59620]:   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 703.466738] env[59620]:     nwinfo = self.network_api.allocate_for_instance(
[ 703.466738] env[59620]:   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 703.466738] env[59620]:     created_port_ids = self._update_ports_for_instance(
[ 703.466738] env[59620]:   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 703.466738] env[59620]:     with excutils.save_and_reraise_exception():
[ 703.466738] env[59620]:   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 703.466738] env[59620]:     self.force_reraise()
[ 703.466738] env[59620]:   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 703.466738] env[59620]:     raise self.value
[ 703.466738] env[59620]:   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 703.466738] env[59620]:     updated_port = self._update_port(
[ 703.466738] env[59620]:   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 703.466738] env[59620]:     _ensure_no_port_binding_failure(port)
[ 703.466738] env[59620]:   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 703.466738] env[59620]:     raise exception.PortBindingFailed(port_id=port['id'])
[ 703.467442] env[59620]: nova.exception.PortBindingFailed: Binding failed for port c6a56c1d-139f-43f1-9a39-a337feab33b5, please check neutron logs for more information.
[ 703.467442] env[59620]: Removing descriptor: 18
[ 703.467442] env[59620]: ERROR nova.compute.manager [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port c6a56c1d-139f-43f1-9a39-a337feab33b5, please check neutron logs for more information.
[ 703.467442] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] Traceback (most recent call last): [ 703.467442] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 703.467442] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] yield resources [ 703.467442] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 703.467442] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] self.driver.spawn(context, instance, image_meta, [ 703.467442] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 703.467442] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] self._vmops.spawn(context, instance, image_meta, injected_files, [ 703.467442] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 703.467442] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] vm_ref = self.build_virtual_machine(instance, [ 703.467813] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 703.467813] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] vif_infos = vmwarevif.get_vif_info(self._session, [ 703.467813] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 703.467813] env[59620]: ERROR 
nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] for vif in network_info: [ 703.467813] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 703.467813] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] return self._sync_wrapper(fn, *args, **kwargs) [ 703.467813] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 703.467813] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] self.wait() [ 703.467813] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 703.467813] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] self[:] = self._gt.wait() [ 703.467813] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 703.467813] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] return self._exit_event.wait() [ 703.467813] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 703.467813] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] result = hub.switch() [ 703.468194] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 703.468194] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] return self.greenlet.switch() [ 703.468194] env[59620]: ERROR 
nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 703.468194] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] result = function(*args, **kwargs) [ 703.468194] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 703.468194] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] return func(*args, **kwargs) [ 703.468194] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 703.468194] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] raise e [ 703.468194] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 703.468194] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] nwinfo = self.network_api.allocate_for_instance( [ 703.468194] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 703.468194] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] created_port_ids = self._update_ports_for_instance( [ 703.468194] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 703.468529] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] with excutils.save_and_reraise_exception(): [ 703.468529] env[59620]: ERROR nova.compute.manager [instance: 
3157e7e4-fe8e-42b6-891b-ae0333b25f33] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 703.468529] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] self.force_reraise() [ 703.468529] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 703.468529] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] raise self.value [ 703.468529] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 703.468529] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] updated_port = self._update_port( [ 703.468529] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 703.468529] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] _ensure_no_port_binding_failure(port) [ 703.468529] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 703.468529] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] raise exception.PortBindingFailed(port_id=port['id']) [ 703.468529] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] nova.exception.PortBindingFailed: Binding failed for port c6a56c1d-139f-43f1-9a39-a337feab33b5, please check neutron logs for more information. 
[ 703.468529] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] [ 703.470716] env[59620]: INFO nova.compute.manager [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] Terminating instance [ 703.470716] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Acquiring lock "refresh_cache-3157e7e4-fe8e-42b6-891b-ae0333b25f33" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 703.470897] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Acquired lock "refresh_cache-3157e7e4-fe8e-42b6-891b-ae0333b25f33" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 703.470970] env[59620]: DEBUG nova.network.neutron [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 703.486534] env[59620]: DEBUG nova.network.neutron [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 703.496595] env[59620]: DEBUG nova.network.neutron [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 703.509939] env[59620]: INFO nova.compute.manager [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 8aec9e05-7685-4895-b375-6f5cd45e7a5f] Took 0.12 seconds to deallocate network for instance. [ 703.521153] env[59620]: DEBUG nova.network.neutron [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 703.544120] env[59620]: INFO nova.scheduler.client.report [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Deleted allocations for instance 9807a449-4cca-416c-815d-99d5bc674464 [ 703.565929] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "9807a449-4cca-416c-815d-99d5bc674464" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 12.123s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 703.581810] env[59620]: DEBUG nova.compute.manager [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] Starting instance... 
{{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 703.655026] env[59620]: INFO nova.scheduler.client.report [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Deleted allocations for instance 8aec9e05-7685-4895-b375-6f5cd45e7a5f [ 703.663023] env[59620]: DEBUG oslo_concurrency.lockutils [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 703.663023] env[59620]: DEBUG oslo_concurrency.lockutils [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 703.663725] env[59620]: INFO nova.compute.claims [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 703.685155] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c20c6cda-8329-470d-bb0a-e8411f60673e tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "8aec9e05-7685-4895-b375-6f5cd45e7a5f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 12.281s {{(pid=59620) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 703.718822] env[59620]: DEBUG nova.compute.manager [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] Starting instance... {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 703.780210] env[59620]: DEBUG oslo_concurrency.lockutils [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 703.794443] env[59620]: DEBUG nova.network.neutron [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 703.904726] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca077321-729d-4313-8189-cc5f23a5a59e {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 703.914352] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32278d84-e33a-4537-b6a6-dadecdd7c13c {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 703.948490] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4a6d2a3-31b5-4697-bc6b-4226bb0c6999 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 703.957544] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a60bbffc-382b-4fe3-8b9e-6fa53dc33ab0 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 703.971494] env[59620]: DEBUG nova.compute.provider_tree [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 703.975684] env[59620]: DEBUG nova.network.neutron [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 703.983166] env[59620]: DEBUG oslo_concurrency.lockutils [None 
req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Releasing lock "refresh_cache-1015f7da-bc69-489b-bb38-b31c1fe919a8" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 703.983413] env[59620]: DEBUG nova.compute.manager [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] Start destroying the instance on the hypervisor. {{(pid=59620) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 703.983604] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] Destroying instance {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 703.984373] env[59620]: DEBUG nova.scheduler.client.report [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 703.987740] env[59620]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-61885716-ea08-4dca-9494-f2b677b829b5 {{(pid=59620) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 703.998051] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b2e20566-380d-4077-8f60-c4f7a691ad39 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 704.009855] env[59620]: DEBUG oslo_concurrency.lockutils [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.348s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 704.010356] env[59620]: DEBUG nova.compute.manager [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] Start building networks asynchronously for instance. 
{{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 704.013472] env[59620]: DEBUG oslo_concurrency.lockutils [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.233s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 704.014943] env[59620]: INFO nova.compute.claims [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 704.025675] env[59620]: WARNING nova.virt.vmwareapi.vmops [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 1015f7da-bc69-489b-bb38-b31c1fe919a8 could not be found. [ 704.025890] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] Instance destroyed {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 704.026053] env[59620]: INFO nova.compute.manager [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 704.026278] env[59620]: DEBUG oslo.service.loopingcall [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59620) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 704.026493] env[59620]: DEBUG nova.compute.manager [-] [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 704.026567] env[59620]: DEBUG nova.network.neutron [-] [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 704.051456] env[59620]: DEBUG nova.compute.utils [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Using /dev/sd instead of None {{(pid=59620) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 704.053460] env[59620]: DEBUG nova.compute.manager [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] Allocating IP information in the background. 
{{(pid=59620) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 704.053460] env[59620]: DEBUG nova.network.neutron [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] allocate_for_instance() {{(pid=59620) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 704.062884] env[59620]: DEBUG nova.compute.manager [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] Start building block device mappings for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 704.103023] env[59620]: DEBUG nova.network.neutron [-] [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 704.110727] env[59620]: DEBUG nova.network.neutron [-] [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 704.126435] env[59620]: INFO nova.compute.manager [-] [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] Took 0.10 seconds to deallocate network for instance. 
[ 704.133401] env[59620]: DEBUG nova.compute.claims [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] Aborting claim: {{(pid=59620) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 704.133401] env[59620]: DEBUG oslo_concurrency.lockutils [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 704.141521] env[59620]: DEBUG nova.compute.manager [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] Start spawning the instance on the hypervisor. 
{{(pid=59620) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 704.168974] env[59620]: DEBUG nova.virt.hardware [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T20:10:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T20:10:40Z,direct_url=,disk_format='vmdk',id=2efa4364-ba59-4de9-978f-169a769ee710,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f77c77bd878b4e17895863a4a7a8191f',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T20:10:41Z,virtual_size=,visibility=), allow threads: False {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 704.169245] env[59620]: DEBUG nova.virt.hardware [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Flavor limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 704.169394] env[59620]: DEBUG nova.virt.hardware [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Image limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 704.169567] env[59620]: DEBUG nova.virt.hardware [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 
tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Flavor pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 704.169734] env[59620]: DEBUG nova.virt.hardware [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Image pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 704.169863] env[59620]: DEBUG nova.virt.hardware [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 704.170079] env[59620]: DEBUG nova.virt.hardware [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 704.170232] env[59620]: DEBUG nova.virt.hardware [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 704.170388] env[59620]: DEBUG nova.virt.hardware [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Got 1 possible topologies {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 
704.170540] env[59620]: DEBUG nova.virt.hardware [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 704.170702] env[59620]: DEBUG nova.virt.hardware [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 704.171625] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4dd0f432-4cfe-41cf-9480-5386ff4fdd5b {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 704.182171] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-91853fe0-f9a5-4202-9fd1-f9ba8787d6cc {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 704.242442] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4d45539-a08e-4220-80ce-10a73c781252 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 704.252658] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4baab705-22a5-4252-bc62-8c2483b803f8 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 704.282381] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4f3a877-eb99-4309-b934-fbf999bf63e0 {{(pid=59620) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 704.290031] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be48b29f-8dd9-43aa-b163-f6315b1ab1e0 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 704.306239] env[59620]: DEBUG nova.compute.provider_tree [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 704.317096] env[59620]: DEBUG nova.scheduler.client.report [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 704.336052] env[59620]: DEBUG oslo_concurrency.lockutils [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.322s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 704.336540] env[59620]: DEBUG nova.compute.manager [None 
req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] Start building networks asynchronously for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 704.339026] env[59620]: DEBUG oslo_concurrency.lockutils [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.206s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 704.382653] env[59620]: DEBUG nova.compute.utils [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Using /dev/sd instead of None {{(pid=59620) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 704.382971] env[59620]: DEBUG nova.compute.manager [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] Allocating IP information in the background. 
{{(pid=59620) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 704.382971] env[59620]: DEBUG nova.network.neutron [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] allocate_for_instance() {{(pid=59620) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 704.387028] env[59620]: DEBUG nova.policy [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f1da4498087e480aa7d065697ca021a7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '348e51e63ac440b1965e88712c05a3f1', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59620) authorize /opt/stack/nova/nova/policy.py:203}} [ 704.392659] env[59620]: DEBUG nova.compute.manager [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] Start building block device mappings for instance. 
{{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 704.446358] env[59620]: DEBUG nova.network.neutron [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 704.460888] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Releasing lock "refresh_cache-3157e7e4-fe8e-42b6-891b-ae0333b25f33" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 704.462134] env[59620]: DEBUG nova.compute.manager [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] Start destroying the instance on the hypervisor. {{(pid=59620) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 704.462134] env[59620]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-5bab1952-aed4-44f0-8cd7-dfad05a726a4 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 704.471066] env[59620]: DEBUG nova.compute.manager [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] Start spawning the instance on the hypervisor. 
{{(pid=59620) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 704.480289] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb24d1da-b0b7-412f-beaf-0c24278487a3 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 704.502913] env[59620]: WARNING nova.virt.vmwareapi.driver [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] Instance does not exists. Proceeding to delete instance properties on datastore: nova.exception.InstanceNotFound: Instance 3157e7e4-fe8e-42b6-891b-ae0333b25f33 could not be found. [ 704.503087] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] Destroying instance {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 704.505789] env[59620]: DEBUG nova.virt.hardware [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T20:11:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='84',id=12,is_public=True,memory_mb=192,name='m1.micro',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta 
ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T20:10:40Z,direct_url=,disk_format='vmdk',id=2efa4364-ba59-4de9-978f-169a769ee710,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f77c77bd878b4e17895863a4a7a8191f',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T20:10:41Z,virtual_size=,visibility=), allow threads: False {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 704.506022] env[59620]: DEBUG nova.virt.hardware [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Flavor limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 704.506176] env[59620]: DEBUG nova.virt.hardware [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Image limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 704.506353] env[59620]: DEBUG nova.virt.hardware [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Flavor pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 704.506490] env[59620]: DEBUG nova.virt.hardware [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Image pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 704.506628] env[59620]: DEBUG nova.virt.hardware [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Chose sockets=0, 
cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 704.506834] env[59620]: DEBUG nova.virt.hardware [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 704.507026] env[59620]: DEBUG nova.virt.hardware [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 704.507200] env[59620]: DEBUG nova.virt.hardware [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Got 1 possible topologies {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 704.507358] env[59620]: DEBUG nova.virt.hardware [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 704.507519] env[59620]: DEBUG nova.virt.hardware [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 704.510732] env[59620]: DEBUG 
oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-07e9b001-5ee7-4b7b-bac8-bf825249ed2c {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 704.513523] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c4e0c35b-7659-44ec-ac31-8c55b05293d0 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 704.535135] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-07deb0f8-84a4-4eec-ba4e-59cae4157e99 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 704.543830] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b924e3e-e6b6-40b8-b20b-be3b29e10a1d {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 704.573169] env[59620]: WARNING nova.virt.vmwareapi.vmops [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 3157e7e4-fe8e-42b6-891b-ae0333b25f33 could not be found. 
[ 704.573169] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] Instance destroyed {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 704.573448] env[59620]: INFO nova.compute.manager [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] Took 0.11 seconds to destroy the instance on the hypervisor. [ 704.573630] env[59620]: DEBUG oslo.service.loopingcall [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=59620) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 704.576563] env[59620]: DEBUG nova.compute.manager [-] [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 704.576913] env[59620]: DEBUG nova.network.neutron [-] [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 704.632281] env[59620]: DEBUG oslo_concurrency.lockutils [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Acquiring lock "73b2fd88-ded1-4a92-a973-6a49e57faa5e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 704.632502] env[59620]: DEBUG oslo_concurrency.lockutils [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Lock "73b2fd88-ded1-4a92-a973-6a49e57faa5e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 704.636650] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b69b9333-c815-4342-be5d-8c82f4b0e822 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 704.639767] env[59620]: DEBUG nova.network.neutron [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] Successfully created port: 
913bdb45-f543-4a9e-8e8c-d569876ac3b1 {{(pid=59620) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 704.647067] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-feac07e4-75b0-4102-a696-ea75e3dafc08 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 704.650708] env[59620]: DEBUG nova.network.neutron [-] [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 704.683374] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2dcb65b5-e6fb-4310-b1f3-1cf079207a97 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 704.686145] env[59620]: DEBUG nova.network.neutron [-] [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 704.692789] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-701ecd37-5d4c-448e-b2d7-8a0fa0d24ee1 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 704.697421] env[59620]: INFO nova.compute.manager [-] [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] Took 0.12 seconds to deallocate network for instance. 
[ 704.708143] env[59620]: DEBUG nova.compute.provider_tree [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 704.721762] env[59620]: DEBUG nova.scheduler.client.report [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 704.734554] env[59620]: DEBUG oslo_concurrency.lockutils [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.395s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 704.735169] env[59620]: ERROR nova.compute.manager [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 1f720111-e6c1-4177-a050-ae028aa2b25a, please check neutron logs for more information. 
[ 704.735169] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] Traceback (most recent call last): [ 704.735169] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 704.735169] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] self.driver.spawn(context, instance, image_meta, [ 704.735169] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 704.735169] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] self._vmops.spawn(context, instance, image_meta, injected_files, [ 704.735169] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 704.735169] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] vm_ref = self.build_virtual_machine(instance, [ 704.735169] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 704.735169] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] vif_infos = vmwarevif.get_vif_info(self._session, [ 704.735169] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 704.735473] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] for vif in network_info: [ 704.735473] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 704.735473] env[59620]: ERROR 
nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] return self._sync_wrapper(fn, *args, **kwargs) [ 704.735473] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 704.735473] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] self.wait() [ 704.735473] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 704.735473] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] self[:] = self._gt.wait() [ 704.735473] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 704.735473] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] return self._exit_event.wait() [ 704.735473] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 704.735473] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] result = hub.switch() [ 704.735473] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 704.735473] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] return self.greenlet.switch() [ 704.735473] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 704.735793] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] result = function(*args, **kwargs) [ 
704.735793] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 704.735793] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] return func(*args, **kwargs) [ 704.735793] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 704.735793] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] raise e [ 704.735793] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 704.735793] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] nwinfo = self.network_api.allocate_for_instance( [ 704.735793] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 704.735793] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] created_port_ids = self._update_ports_for_instance( [ 704.735793] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 704.735793] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] with excutils.save_and_reraise_exception(): [ 704.735793] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 704.735793] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] self.force_reraise() [ 704.736116] env[59620]: ERROR nova.compute.manager [instance: 
1015f7da-bc69-489b-bb38-b31c1fe919a8] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 704.736116] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] raise self.value [ 704.736116] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 704.736116] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] updated_port = self._update_port( [ 704.736116] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 704.736116] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] _ensure_no_port_binding_failure(port) [ 704.736116] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 704.736116] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] raise exception.PortBindingFailed(port_id=port['id']) [ 704.736116] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] nova.exception.PortBindingFailed: Binding failed for port 1f720111-e6c1-4177-a050-ae028aa2b25a, please check neutron logs for more information. [ 704.736116] env[59620]: ERROR nova.compute.manager [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] [ 704.736116] env[59620]: DEBUG nova.compute.utils [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] Binding failed for port 1f720111-e6c1-4177-a050-ae028aa2b25a, please check neutron logs for more information. 
{{(pid=59620) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 704.738235] env[59620]: DEBUG nova.policy [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'db368a78ac5245d6a869b37de0fc1d2e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '61af4231bb9c4fc2a2d742b7c3d1db40', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59620) authorize /opt/stack/nova/nova/policy.py:203}} [ 704.740107] env[59620]: DEBUG nova.compute.manager [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] Build of instance 1015f7da-bc69-489b-bb38-b31c1fe919a8 was re-scheduled: Binding failed for port 1f720111-e6c1-4177-a050-ae028aa2b25a, please check neutron logs for more information. 
{{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 704.740460] env[59620]: DEBUG nova.compute.manager [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] Unplugging VIFs for instance {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 704.740673] env[59620]: DEBUG oslo_concurrency.lockutils [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Acquiring lock "refresh_cache-1015f7da-bc69-489b-bb38-b31c1fe919a8" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 704.740811] env[59620]: DEBUG oslo_concurrency.lockutils [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Acquired lock "refresh_cache-1015f7da-bc69-489b-bb38-b31c1fe919a8" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 704.740963] env[59620]: DEBUG nova.network.neutron [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 704.770529] env[59620]: INFO nova.compute.manager [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] Took 0.07 seconds to detach 1 volumes for instance. 
[ 704.772590] env[59620]: DEBUG nova.compute.claims [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] Aborting claim: {{(pid=59620) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 704.772753] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 704.772953] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 704.945218] env[59620]: DEBUG nova.network.neutron [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 704.956070] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-621a1635-0a87-4a70-94b9-db9d43191f47 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 704.963468] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f9ca7de6-ce37-49b0-b812-b253af16fe05 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 704.994211] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4849049-1181-4075-beed-fe0a050ac305 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 705.002217] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7809409-0049-4c17-9b59-2f79dfaa41ef {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 705.016953] env[59620]: DEBUG nova.compute.provider_tree [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 705.024912] env[59620]: DEBUG nova.scheduler.client.report [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 
'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 705.039618] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.267s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 705.040238] env[59620]: ERROR nova.compute.manager [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port c6a56c1d-139f-43f1-9a39-a337feab33b5, please check neutron logs for more information. 
[ 705.040238] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] Traceback (most recent call last): [ 705.040238] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 705.040238] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] self.driver.spawn(context, instance, image_meta, [ 705.040238] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 705.040238] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] self._vmops.spawn(context, instance, image_meta, injected_files, [ 705.040238] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 705.040238] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] vm_ref = self.build_virtual_machine(instance, [ 705.040238] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 705.040238] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] vif_infos = vmwarevif.get_vif_info(self._session, [ 705.040238] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 705.040563] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] for vif in network_info: [ 705.040563] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 705.040563] env[59620]: ERROR 
nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] return self._sync_wrapper(fn, *args, **kwargs) [ 705.040563] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 705.040563] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] self.wait() [ 705.040563] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 705.040563] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] self[:] = self._gt.wait() [ 705.040563] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 705.040563] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] return self._exit_event.wait() [ 705.040563] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 705.040563] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] result = hub.switch() [ 705.040563] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 705.040563] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] return self.greenlet.switch() [ 705.040563] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 705.040953] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] result = function(*args, **kwargs) [ 
705.040953] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 705.040953] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] return func(*args, **kwargs) [ 705.040953] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 705.040953] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] raise e [ 705.040953] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 705.040953] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] nwinfo = self.network_api.allocate_for_instance( [ 705.040953] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 705.040953] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] created_port_ids = self._update_ports_for_instance( [ 705.040953] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 705.040953] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] with excutils.save_and_reraise_exception(): [ 705.040953] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 705.040953] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] self.force_reraise() [ 705.041314] env[59620]: ERROR nova.compute.manager [instance: 
3157e7e4-fe8e-42b6-891b-ae0333b25f33] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 705.041314] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] raise self.value [ 705.041314] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 705.041314] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] updated_port = self._update_port( [ 705.041314] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 705.041314] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] _ensure_no_port_binding_failure(port) [ 705.041314] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 705.041314] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] raise exception.PortBindingFailed(port_id=port['id']) [ 705.041314] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] nova.exception.PortBindingFailed: Binding failed for port c6a56c1d-139f-43f1-9a39-a337feab33b5, please check neutron logs for more information. [ 705.041314] env[59620]: ERROR nova.compute.manager [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] [ 705.041314] env[59620]: DEBUG nova.compute.utils [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] Binding failed for port c6a56c1d-139f-43f1-9a39-a337feab33b5, please check neutron logs for more information. 
{{(pid=59620) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 705.042368] env[59620]: DEBUG nova.compute.manager [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] Build of instance 3157e7e4-fe8e-42b6-891b-ae0333b25f33 was re-scheduled: Binding failed for port c6a56c1d-139f-43f1-9a39-a337feab33b5, please check neutron logs for more information. {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 705.042765] env[59620]: DEBUG nova.compute.manager [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] Unplugging VIFs for instance {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 705.042981] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Acquiring lock "refresh_cache-3157e7e4-fe8e-42b6-891b-ae0333b25f33" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 705.043401] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Acquired lock "refresh_cache-3157e7e4-fe8e-42b6-891b-ae0333b25f33" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 705.043401] env[59620]: DEBUG nova.network.neutron [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] Building 
network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 705.135625] env[59620]: DEBUG nova.network.neutron [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 705.534807] env[59620]: DEBUG nova.network.neutron [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 705.544774] env[59620]: DEBUG oslo_concurrency.lockutils [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Releasing lock "refresh_cache-1015f7da-bc69-489b-bb38-b31c1fe919a8" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 705.544997] env[59620]: DEBUG nova.compute.manager [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 705.545186] env[59620]: DEBUG nova.compute.manager [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 705.545344] env[59620]: DEBUG nova.network.neutron [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 705.620982] env[59620]: DEBUG nova.network.neutron [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 705.631098] env[59620]: DEBUG nova.network.neutron [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 705.641335] env[59620]: INFO nova.compute.manager [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] [instance: 1015f7da-bc69-489b-bb38-b31c1fe919a8] Took 0.10 seconds to deallocate network for instance. 
[ 705.748206] env[59620]: INFO nova.scheduler.client.report [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Deleted allocations for instance 1015f7da-bc69-489b-bb38-b31c1fe919a8 [ 705.767086] env[59620]: DEBUG oslo_concurrency.lockutils [None req-2a496f26-98de-4e80-bd2f-f3a7e435a774 tempest-ServerActionsTestJSON-841667631 tempest-ServerActionsTestJSON-841667631-project-member] Lock "1015f7da-bc69-489b-bb38-b31c1fe919a8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 12.404s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 705.781661] env[59620]: DEBUG nova.compute.manager [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] Starting instance... 
{{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 705.838597] env[59620]: DEBUG oslo_concurrency.lockutils [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 705.838831] env[59620]: DEBUG oslo_concurrency.lockutils [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 705.840366] env[59620]: INFO nova.compute.claims [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 705.920051] env[59620]: DEBUG nova.network.neutron [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 705.949694] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Releasing lock "refresh_cache-3157e7e4-fe8e-42b6-891b-ae0333b25f33" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 
705.949951] env[59620]: DEBUG nova.compute.manager [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 705.950142] env[59620]: DEBUG nova.compute.manager [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 705.950470] env[59620]: DEBUG nova.network.neutron [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 706.040616] env[59620]: DEBUG nova.network.neutron [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 706.051320] env[59620]: DEBUG nova.network.neutron [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 706.059814] env[59620]: INFO nova.compute.manager [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] [instance: 3157e7e4-fe8e-42b6-891b-ae0333b25f33] Took 0.11 seconds to deallocate network for instance. [ 706.064442] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f5b1abdd-be7d-4193-9d32-ff705acc1418 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.074956] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a8903aa-2391-4fd4-b317-2d1897617a86 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.108145] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ebd24dc7-4f3d-44ae-9f7f-cc6a0398382e {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.116077] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-46693025-01e7-40d3-8857-7edbeab73362 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.131271] env[59620]: DEBUG nova.compute.provider_tree [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 
tempest-ServerGroupTestJSON-1233429301-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 706.141234] env[59620]: DEBUG nova.scheduler.client.report [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 706.172723] env[59620]: DEBUG oslo_concurrency.lockutils [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.334s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 706.198311] env[59620]: DEBUG oslo_concurrency.lockutils [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Acquiring lock "64961436-8598-473d-aa60-e137daf18fe6" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.._do_validation" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 706.198860] env[59620]: DEBUG oslo_concurrency.lockutils [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 
tempest-ServerGroupTestJSON-1233429301-project-member] Lock "64961436-8598-473d-aa60-e137daf18fe6" acquired by "nova.compute.manager.ComputeManager._validate_instance_group_policy.._do_validation" :: waited 0.001s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 706.208087] env[59620]: ERROR nova.compute.manager [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 2b350739-2c35-49e6-891a-61da85a49d31, please check neutron logs for more information. [ 706.208087] env[59620]: ERROR nova.compute.manager Traceback (most recent call last): [ 706.208087] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 706.208087] env[59620]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 706.208087] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 706.208087] env[59620]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 706.208087] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 706.208087] env[59620]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 706.208087] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 706.208087] env[59620]: ERROR nova.compute.manager self.force_reraise() [ 706.208087] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 706.208087] env[59620]: ERROR nova.compute.manager raise self.value [ 706.208087] env[59620]: ERROR 
nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 706.208087] env[59620]: ERROR nova.compute.manager updated_port = self._update_port(
[ 706.208087] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 706.208087] env[59620]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port)
[ 706.208491] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 706.208491] env[59620]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id'])
[ 706.208491] env[59620]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 2b350739-2c35-49e6-891a-61da85a49d31, please check neutron logs for more information.
[ 706.208491] env[59620]: ERROR nova.compute.manager
[ 706.208491] env[59620]: Traceback (most recent call last):
[ 706.208491] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait
[ 706.208491] env[59620]: listener.cb(fileno)
[ 706.208491] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 706.208491] env[59620]: result = function(*args, **kwargs)
[ 706.208491] env[59620]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 706.208491] env[59620]: return func(*args, **kwargs)
[ 706.208491] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 706.208491] env[59620]: raise e
[ 706.208491] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 706.208491] env[59620]: nwinfo = self.network_api.allocate_for_instance(
[ 706.208491] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 706.208491] env[59620]: created_port_ids = self._update_ports_for_instance(
[ 706.208491] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 706.208491] env[59620]: with excutils.save_and_reraise_exception():
[ 706.208491] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 706.208491] env[59620]: self.force_reraise()
[ 706.208491] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 706.208491] env[59620]: raise self.value
[ 706.208491] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 706.208491] env[59620]: updated_port = self._update_port(
[ 706.208491] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 706.208491] env[59620]: _ensure_no_port_binding_failure(port)
[ 706.208491] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 706.208491] env[59620]: raise exception.PortBindingFailed(port_id=port['id'])
[ 706.209180] env[59620]: nova.exception.PortBindingFailed: Binding failed for port 2b350739-2c35-49e6-891a-61da85a49d31, please check neutron logs for more information.
[ 706.209180] env[59620]: Removing descriptor: 13
[ 706.209180] env[59620]: ERROR nova.compute.manager [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 2b350739-2c35-49e6-891a-61da85a49d31, please check neutron logs for more information.
[ 706.209180] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] Traceback (most recent call last):
[ 706.209180] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 706.209180] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] yield resources
[ 706.209180] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 706.209180] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] self.driver.spawn(context, instance, image_meta,
[ 706.209180] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 706.209180] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 706.209180] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 706.209180] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] vm_ref = self.build_virtual_machine(instance,
[ 706.209481] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 706.209481] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] vif_infos = vmwarevif.get_vif_info(self._session,
[ 706.209481] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 706.209481] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] for vif in network_info:
[ 706.209481] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 706.209481] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] return self._sync_wrapper(fn, *args, **kwargs)
[ 706.209481] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 706.209481] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] self.wait()
[ 706.209481] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 706.209481] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] self[:] = self._gt.wait()
[ 706.209481] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 706.209481] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] return self._exit_event.wait()
[ 706.209481] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 706.209481] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] result = hub.switch()
[ 706.209841] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 706.209841] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] return self.greenlet.switch()
[ 706.209841] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 706.209841] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] result = function(*args, **kwargs)
[ 706.209841] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 706.209841] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] return func(*args, **kwargs)
[ 706.209841] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 706.209841] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] raise e
[ 706.209841] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 706.209841] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] nwinfo = self.network_api.allocate_for_instance(
[ 706.209841] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 706.209841] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] created_port_ids = self._update_ports_for_instance(
[ 706.209841] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 706.210167] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] with excutils.save_and_reraise_exception():
[ 706.210167] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 706.210167] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] self.force_reraise()
[ 706.210167] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 706.210167] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] raise self.value
[ 706.210167] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 706.210167] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] updated_port = self._update_port(
[ 706.210167] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 706.210167] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] _ensure_no_port_binding_failure(port)
[ 706.210167] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 706.210167] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] raise exception.PortBindingFailed(port_id=port['id'])
[ 706.210167] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] nova.exception.PortBindingFailed: Binding failed for port 2b350739-2c35-49e6-891a-61da85a49d31, please check neutron logs for more information.
[ 706.210167] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a]
[ 706.210497] env[59620]: INFO nova.compute.manager [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] Terminating instance
[ 706.214025] env[59620]: DEBUG oslo_concurrency.lockutils [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Lock "64961436-8598-473d-aa60-e137daf18fe6" "released" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.._do_validation" :: held 0.012s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 706.214025] env[59620]: DEBUG nova.compute.manager [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] Start building networks asynchronously for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 706.214025] env[59620]: DEBUG oslo_concurrency.lockutils [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Acquiring lock "refresh_cache-d65656c5-2cdb-4152-8e47-20d182d39c7a" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 706.214468] env[59620]: DEBUG oslo_concurrency.lockutils [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Acquired lock "refresh_cache-d65656c5-2cdb-4152-8e47-20d182d39c7a" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 706.215062] env[59620]: DEBUG nova.network.neutron [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 706.229488] env[59620]: INFO nova.scheduler.client.report [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Deleted allocations for instance 3157e7e4-fe8e-42b6-891b-ae0333b25f33
[ 706.251831] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e7d3ab3a-b6e4-4e29-9faa-9d8b79887071 tempest-ServersTestBootFromVolume-1733111900 tempest-ServersTestBootFromVolume-1733111900-project-member] Lock "3157e7e4-fe8e-42b6-891b-ae0333b25f33" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 13.753s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 706.265838] env[59620]: DEBUG nova.compute.utils [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Using /dev/sd instead of None {{(pid=59620) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 706.265838] env[59620]: DEBUG nova.compute.manager [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] Allocating IP information in the background. {{(pid=59620) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 706.272185] env[59620]: DEBUG nova.network.neutron [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] allocate_for_instance() {{(pid=59620) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 706.284278] env[59620]: DEBUG nova.compute.manager [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] Start building block device mappings for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 706.297805] env[59620]: DEBUG nova.network.neutron [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 706.370185] env[59620]: DEBUG nova.compute.manager [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] Start spawning the instance on the hypervisor. {{(pid=59620) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 706.404017] env[59620]: DEBUG nova.virt.hardware [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T20:10:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T20:10:40Z,direct_url=,disk_format='vmdk',id=2efa4364-ba59-4de9-978f-169a769ee710,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f77c77bd878b4e17895863a4a7a8191f',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T20:10:41Z,virtual_size=,visibility=), allow threads: False {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 706.404017] env[59620]: DEBUG nova.virt.hardware [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Flavor limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 706.404017] env[59620]: DEBUG nova.virt.hardware [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Image limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 706.404192] env[59620]: DEBUG nova.virt.hardware [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Flavor pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 706.404192] env[59620]: DEBUG nova.virt.hardware [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Image pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 706.404192] env[59620]: DEBUG nova.virt.hardware [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 706.404557] env[59620]: DEBUG nova.virt.hardware [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 706.405445] env[59620]: DEBUG nova.virt.hardware [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 706.405713] env[59620]: DEBUG nova.virt.hardware [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Got 1 possible topologies {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 706.405971] env[59620]: DEBUG nova.virt.hardware [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 706.408034] env[59620]: DEBUG nova.virt.hardware [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 706.408034] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b9a9c0e-1f17-4ec5-8690-c20caafd69ea {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 706.418823] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a20eabe0-6aee-4fc1-8f63-0e5c86bcb603 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 706.460903] env[59620]: DEBUG nova.network.neutron [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] Successfully created port: ee509bee-9216-4859-ac9b-0313db219df0 {{(pid=59620) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 706.574715] env[59620]: DEBUG nova.policy [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5858f0bb101c4a7dbd17a249240141be', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c2fd30bdaf134bbcb4e8e9840b095a9e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59620) authorize /opt/stack/nova/nova/policy.py:203}}
[ 706.653099] env[59620]: DEBUG nova.network.neutron [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] Successfully created port: 50df6311-769b-4b6b-9bbd-76405db8df9b {{(pid=59620) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 706.715802] env[59620]: DEBUG nova.network.neutron [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 706.729648] env[59620]: DEBUG oslo_concurrency.lockutils [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Releasing lock "refresh_cache-d65656c5-2cdb-4152-8e47-20d182d39c7a" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 706.730637] env[59620]: DEBUG nova.compute.manager [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] Start destroying the instance on the hypervisor. {{(pid=59620) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 706.730637] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] Destroying instance {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 706.731072] env[59620]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-f7c3e42b-644e-45d6-90f1-66bfc7ff553c {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 706.740852] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-64655a3b-fc65-4d15-899f-8d1414342cfb {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 706.774026] env[59620]: WARNING nova.virt.vmwareapi.vmops [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance d65656c5-2cdb-4152-8e47-20d182d39c7a could not be found.
[ 706.774026] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] Instance destroyed {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 706.774026] env[59620]: INFO nova.compute.manager [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 706.774026] env[59620]: DEBUG oslo.service.loopingcall [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59620) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 706.774026] env[59620]: DEBUG nova.compute.manager [-] [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 706.774568] env[59620]: DEBUG nova.network.neutron [-] [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 706.825267] env[59620]: DEBUG nova.network.neutron [-] [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 706.837489] env[59620]: DEBUG nova.network.neutron [-] [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 706.846898] env[59620]: INFO nova.compute.manager [-] [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] Took 0.07 seconds to deallocate network for instance.
[ 706.849375] env[59620]: ERROR nova.compute.manager [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port efe32174-7153-4005-a87b-4cc066a6d6b7, please check neutron logs for more information.
[ 706.849375] env[59620]: ERROR nova.compute.manager Traceback (most recent call last):
[ 706.849375] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 706.849375] env[59620]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance(
[ 706.849375] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 706.849375] env[59620]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance(
[ 706.849375] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 706.849375] env[59620]: ERROR nova.compute.manager with excutils.save_and_reraise_exception():
[ 706.849375] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 706.849375] env[59620]: ERROR nova.compute.manager self.force_reraise()
[ 706.849375] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 706.849375] env[59620]: ERROR nova.compute.manager raise self.value
[ 706.849375] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 706.849375] env[59620]: ERROR nova.compute.manager updated_port = self._update_port(
[ 706.849375] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 706.849375] env[59620]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port)
[ 706.849789] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 706.849789] env[59620]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id'])
[ 706.849789] env[59620]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port efe32174-7153-4005-a87b-4cc066a6d6b7, please check neutron logs for more information.
[ 706.849789] env[59620]: ERROR nova.compute.manager
[ 706.849789] env[59620]: Traceback (most recent call last):
[ 706.849789] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait
[ 706.849789] env[59620]: listener.cb(fileno)
[ 706.849789] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 706.849789] env[59620]: result = function(*args, **kwargs)
[ 706.849789] env[59620]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 706.849789] env[59620]: return func(*args, **kwargs)
[ 706.849789] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 706.849789] env[59620]: raise e
[ 706.849789] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 706.849789] env[59620]: nwinfo = self.network_api.allocate_for_instance(
[ 706.849789] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 706.849789] env[59620]: created_port_ids = self._update_ports_for_instance(
[ 706.849789] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 706.849789] env[59620]: with excutils.save_and_reraise_exception():
[ 706.849789] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 706.849789] env[59620]: self.force_reraise()
[ 706.849789] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 706.849789] env[59620]: raise self.value
[ 706.849789] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 706.849789] env[59620]: updated_port = self._update_port(
[ 706.849789] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 706.849789] env[59620]: _ensure_no_port_binding_failure(port)
[ 706.849789] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 706.849789] env[59620]: raise exception.PortBindingFailed(port_id=port['id'])
[ 706.850578] env[59620]: nova.exception.PortBindingFailed: Binding failed for port efe32174-7153-4005-a87b-4cc066a6d6b7, please check neutron logs for more information.
[ 706.850578] env[59620]: Removing descriptor: 21
[ 706.851394] env[59620]: ERROR nova.compute.manager [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port efe32174-7153-4005-a87b-4cc066a6d6b7, please check neutron logs for more information.
[ 706.851394] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] Traceback (most recent call last):
[ 706.851394] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 706.851394] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] yield resources
[ 706.851394] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 706.851394] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] self.driver.spawn(context, instance, image_meta,
[ 706.851394] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 706.851394] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 706.851394] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 706.851394] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] vm_ref = self.build_virtual_machine(instance,
[ 706.851394] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 706.851673] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] vif_infos = vmwarevif.get_vif_info(self._session,
[ 706.851673] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 706.851673] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] for vif in network_info:
[ 706.851673] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 706.851673] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] return self._sync_wrapper(fn, *args, **kwargs)
[ 706.851673] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 706.851673] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] self.wait()
[ 706.851673] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 706.851673] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] self[:] = self._gt.wait()
[ 706.851673] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 706.851673] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] return self._exit_event.wait()
[ 706.851673] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 706.851673] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] result = hub.switch()
[ 706.851673] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 706.852047] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] return self.greenlet.switch()
[ 706.852047] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 706.852047] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] result = function(*args, **kwargs)
[ 706.852047] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 706.852047] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] return func(*args, **kwargs)
[ 706.852047] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 706.852047] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] raise e
[ 706.852047] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 706.852047] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] nwinfo = self.network_api.allocate_for_instance(
[ 706.852047] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 706.852047] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] created_port_ids = self._update_ports_for_instance(
[ 706.852047] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 706.852047] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] with excutils.save_and_reraise_exception():
[ 706.852338] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 706.852338] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] self.force_reraise()
[ 706.852338] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 706.852338] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] raise self.value
[ 706.852338] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 706.852338] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] updated_port = self._update_port(
[ 706.852338] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 706.852338] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8]
_ensure_no_port_binding_failure(port) [ 706.852338] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 706.852338] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] raise exception.PortBindingFailed(port_id=port['id']) [ 706.852338] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] nova.exception.PortBindingFailed: Binding failed for port efe32174-7153-4005-a87b-4cc066a6d6b7, please check neutron logs for more information. [ 706.852338] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] [ 706.852672] env[59620]: INFO nova.compute.manager [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] Terminating instance [ 706.855885] env[59620]: DEBUG nova.compute.claims [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] Aborting claim: {{(pid=59620) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 706.855885] env[59620]: DEBUG oslo_concurrency.lockutils [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 706.855885] env[59620]: DEBUG oslo_concurrency.lockutils [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Lock "compute_resources" 
acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 706.868182] env[59620]: DEBUG oslo_concurrency.lockutils [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Acquiring lock "refresh_cache-06b62938-99d6-43a1-af87-aced894bc8d8" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 706.868182] env[59620]: DEBUG oslo_concurrency.lockutils [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Acquired lock "refresh_cache-06b62938-99d6-43a1-af87-aced894bc8d8" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 706.868182] env[59620]: DEBUG nova.network.neutron [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 706.919495] env[59620]: DEBUG nova.network.neutron [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 707.089573] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9eacef18-1478-43d0-a745-a9546a9940bc {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 707.098981] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a51226a-cc8e-48d1-aa00-cfea0a5d8ee3 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 707.132297] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2ac00ac-ef2b-4ce6-8ac3-f712e2d73aa0 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 707.140886] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09f8d8b4-d74f-459c-8867-0c6b72f2b44d {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 707.155706] env[59620]: DEBUG nova.compute.provider_tree [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 707.165114] env[59620]: ERROR nova.compute.manager [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 0eb03d48-4be8-4f39-82d5-41e84fcb3939, please check neutron logs for more information. 
[ 707.165114] env[59620]: ERROR nova.compute.manager Traceback (most recent call last): [ 707.165114] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 707.165114] env[59620]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 707.165114] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 707.165114] env[59620]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 707.165114] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 707.165114] env[59620]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 707.165114] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 707.165114] env[59620]: ERROR nova.compute.manager self.force_reraise() [ 707.165114] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 707.165114] env[59620]: ERROR nova.compute.manager raise self.value [ 707.165114] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 707.165114] env[59620]: ERROR nova.compute.manager updated_port = self._update_port( [ 707.165114] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 707.165114] env[59620]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 707.165880] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 707.165880] env[59620]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 707.165880] env[59620]: ERROR 
nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 0eb03d48-4be8-4f39-82d5-41e84fcb3939, please check neutron logs for more information. [ 707.165880] env[59620]: ERROR nova.compute.manager [ 707.165880] env[59620]: Traceback (most recent call last): [ 707.165880] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 707.165880] env[59620]: listener.cb(fileno) [ 707.165880] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 707.165880] env[59620]: result = function(*args, **kwargs) [ 707.165880] env[59620]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 707.165880] env[59620]: return func(*args, **kwargs) [ 707.165880] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 707.165880] env[59620]: raise e [ 707.165880] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 707.165880] env[59620]: nwinfo = self.network_api.allocate_for_instance( [ 707.165880] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 707.165880] env[59620]: created_port_ids = self._update_ports_for_instance( [ 707.165880] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 707.165880] env[59620]: with excutils.save_and_reraise_exception(): [ 707.165880] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 707.165880] env[59620]: self.force_reraise() [ 707.165880] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 707.165880] env[59620]: raise self.value [ 707.165880] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 707.165880] env[59620]: updated_port = self._update_port( [ 707.165880] 
env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 707.165880] env[59620]: _ensure_no_port_binding_failure(port) [ 707.165880] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 707.165880] env[59620]: raise exception.PortBindingFailed(port_id=port['id']) [ 707.167640] env[59620]: nova.exception.PortBindingFailed: Binding failed for port 0eb03d48-4be8-4f39-82d5-41e84fcb3939, please check neutron logs for more information. [ 707.167640] env[59620]: Removing descriptor: 22 [ 707.167640] env[59620]: ERROR nova.compute.manager [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 0eb03d48-4be8-4f39-82d5-41e84fcb3939, please check neutron logs for more information. [ 707.167640] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] Traceback (most recent call last): [ 707.167640] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 707.167640] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] yield resources [ 707.167640] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 707.167640] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] self.driver.spawn(context, instance, image_meta, [ 707.167640] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 707.167640] env[59620]: ERROR nova.compute.manager [instance: 
630acd3e-e4e3-483b-984c-7023fd8c77d5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 707.167640] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 707.167640] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] vm_ref = self.build_virtual_machine(instance, [ 707.168623] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 707.168623] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] vif_infos = vmwarevif.get_vif_info(self._session, [ 707.168623] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 707.168623] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] for vif in network_info: [ 707.168623] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 707.168623] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] return self._sync_wrapper(fn, *args, **kwargs) [ 707.168623] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 707.168623] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] self.wait() [ 707.168623] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 707.168623] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] self[:] = self._gt.wait() [ 707.168623] 
env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 707.168623] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] return self._exit_event.wait() [ 707.168623] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 707.168623] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] result = hub.switch() [ 707.169337] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 707.169337] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] return self.greenlet.switch() [ 707.169337] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 707.169337] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] result = function(*args, **kwargs) [ 707.169337] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 707.169337] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] return func(*args, **kwargs) [ 707.169337] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 707.169337] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] raise e [ 707.169337] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] File 
"/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 707.169337] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] nwinfo = self.network_api.allocate_for_instance( [ 707.169337] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 707.169337] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] created_port_ids = self._update_ports_for_instance( [ 707.169337] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 707.169945] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] with excutils.save_and_reraise_exception(): [ 707.169945] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 707.169945] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] self.force_reraise() [ 707.169945] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 707.169945] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] raise self.value [ 707.169945] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 707.169945] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] updated_port = self._update_port( [ 707.169945] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] File 
"/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 707.169945] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] _ensure_no_port_binding_failure(port) [ 707.169945] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 707.169945] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] raise exception.PortBindingFailed(port_id=port['id']) [ 707.169945] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] nova.exception.PortBindingFailed: Binding failed for port 0eb03d48-4be8-4f39-82d5-41e84fcb3939, please check neutron logs for more information. [ 707.169945] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] [ 707.170821] env[59620]: INFO nova.compute.manager [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] Terminating instance [ 707.170821] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Acquiring lock "refresh_cache-630acd3e-e4e3-483b-984c-7023fd8c77d5" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 707.170821] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Acquired lock "refresh_cache-630acd3e-e4e3-483b-984c-7023fd8c77d5" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 707.170821] env[59620]: DEBUG nova.network.neutron [None 
req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 707.171028] env[59620]: DEBUG nova.scheduler.client.report [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 707.183830] env[59620]: DEBUG oslo_concurrency.lockutils [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.328s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 707.184447] env[59620]: ERROR nova.compute.manager [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 2b350739-2c35-49e6-891a-61da85a49d31, please check neutron logs for more information. 
[ 707.184447] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] Traceback (most recent call last): [ 707.184447] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 707.184447] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] self.driver.spawn(context, instance, image_meta, [ 707.184447] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 707.184447] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 707.184447] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 707.184447] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] vm_ref = self.build_virtual_machine(instance, [ 707.184447] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 707.184447] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] vif_infos = vmwarevif.get_vif_info(self._session, [ 707.184447] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 707.184816] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] for vif in network_info: [ 707.184816] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 707.184816] env[59620]: ERROR 
nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] return self._sync_wrapper(fn, *args, **kwargs) [ 707.184816] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 707.184816] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] self.wait() [ 707.184816] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 707.184816] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] self[:] = self._gt.wait() [ 707.184816] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 707.184816] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] return self._exit_event.wait() [ 707.184816] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 707.184816] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] result = hub.switch() [ 707.184816] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 707.184816] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] return self.greenlet.switch() [ 707.184816] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 707.185174] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] result = function(*args, **kwargs) [ 
707.185174] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 707.185174] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] return func(*args, **kwargs)
[ 707.185174] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 707.185174] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] raise e
[ 707.185174] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 707.185174] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] nwinfo = self.network_api.allocate_for_instance(
[ 707.185174] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 707.185174] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] created_port_ids = self._update_ports_for_instance(
[ 707.185174] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 707.185174] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] with excutils.save_and_reraise_exception():
[ 707.185174] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 707.185174] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] self.force_reraise()
[ 707.185498] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 707.185498] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] raise self.value
[ 707.185498] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 707.185498] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] updated_port = self._update_port(
[ 707.185498] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 707.185498] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] _ensure_no_port_binding_failure(port)
[ 707.185498] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 707.185498] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] raise exception.PortBindingFailed(port_id=port['id'])
[ 707.185498] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] nova.exception.PortBindingFailed: Binding failed for port 2b350739-2c35-49e6-891a-61da85a49d31, please check neutron logs for more information.
[ 707.185498] env[59620]: ERROR nova.compute.manager [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a]
[ 707.185498] env[59620]: DEBUG nova.compute.utils [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] Binding failed for port 2b350739-2c35-49e6-891a-61da85a49d31, please check neutron logs for more information. {{(pid=59620) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 707.187901] env[59620]: DEBUG nova.compute.manager [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] Build of instance d65656c5-2cdb-4152-8e47-20d182d39c7a was re-scheduled: Binding failed for port 2b350739-2c35-49e6-891a-61da85a49d31, please check neutron logs for more information. {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 707.188351] env[59620]: DEBUG nova.compute.manager [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] Unplugging VIFs for instance {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 707.188570] env[59620]: DEBUG oslo_concurrency.lockutils [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Acquiring lock "refresh_cache-d65656c5-2cdb-4152-8e47-20d182d39c7a" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 707.188713] env[59620]: DEBUG oslo_concurrency.lockutils [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Acquired lock "refresh_cache-d65656c5-2cdb-4152-8e47-20d182d39c7a" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 707.188869] env[59620]: DEBUG nova.network.neutron [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 707.256569] env[59620]: DEBUG nova.network.neutron [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 707.364251] env[59620]: DEBUG nova.network.neutron [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 707.445196] env[59620]: DEBUG nova.network.neutron [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 707.458419] env[59620]: DEBUG oslo_concurrency.lockutils [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Releasing lock "refresh_cache-06b62938-99d6-43a1-af87-aced894bc8d8" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 707.458419] env[59620]: DEBUG nova.compute.manager [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] Start destroying the instance on the hypervisor. {{(pid=59620) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 707.458419] env[59620]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-8208149f-b310-4ac0-a7af-5d39a792da42 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 707.470842] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3b9f93d-7ca6-4cc9-a055-0b4354e943c2 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 707.497413] env[59620]: WARNING nova.virt.vmwareapi.driver [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] Instance does not exists. Proceeding to delete instance properties on datastore: nova.exception.InstanceNotFound: Instance 06b62938-99d6-43a1-af87-aced894bc8d8 could not be found.
[ 707.497545] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] Destroying instance {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 707.497864] env[59620]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-e9a9cb18-46bb-4fa0-bbae-76fedf576b16 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 707.507658] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d57ed18-93fa-425e-89aa-2641e2bbd74d {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 707.534852] env[59620]: WARNING nova.virt.vmwareapi.vmops [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 06b62938-99d6-43a1-af87-aced894bc8d8 could not be found.
[ 707.535076] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] Instance destroyed {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 707.535250] env[59620]: INFO nova.compute.manager [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] Took 0.08 seconds to destroy the instance on the hypervisor.
[ 707.535478] env[59620]: DEBUG oslo.service.loopingcall [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59620) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 707.535676] env[59620]: DEBUG nova.compute.manager [-] [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 707.535764] env[59620]: DEBUG nova.network.neutron [-] [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 707.611478] env[59620]: DEBUG nova.network.neutron [-] [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 707.620601] env[59620]: DEBUG nova.network.neutron [-] [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 707.631321] env[59620]: INFO nova.compute.manager [-] [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] Took 0.10 seconds to deallocate network for instance.
[ 707.704843] env[59620]: INFO nova.compute.manager [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] Took 0.07 seconds to detach 1 volumes for instance.
[ 707.707891] env[59620]: DEBUG nova.compute.claims [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] Aborting claim: {{(pid=59620) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 707.707891] env[59620]: DEBUG oslo_concurrency.lockutils [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 707.707891] env[59620]: DEBUG oslo_concurrency.lockutils [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 707.711466] env[59620]: DEBUG nova.network.neutron [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 707.727057] env[59620]: DEBUG oslo_concurrency.lockutils [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Releasing lock "refresh_cache-d65656c5-2cdb-4152-8e47-20d182d39c7a" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 707.728325] env[59620]: DEBUG nova.compute.manager [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 707.728325] env[59620]: DEBUG nova.compute.manager [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 707.728325] env[59620]: DEBUG nova.network.neutron [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 707.768350] env[59620]: DEBUG nova.network.neutron [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 707.777359] env[59620]: DEBUG nova.network.neutron [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 707.786958] env[59620]: DEBUG nova.network.neutron [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 707.791067] env[59620]: INFO nova.compute.manager [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: d65656c5-2cdb-4152-8e47-20d182d39c7a] Took 0.06 seconds to deallocate network for instance.
[ 707.797468] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Releasing lock "refresh_cache-630acd3e-e4e3-483b-984c-7023fd8c77d5" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 707.798059] env[59620]: DEBUG nova.compute.manager [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] Start destroying the instance on the hypervisor. {{(pid=59620) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 707.798536] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] Destroying instance {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 707.801479] env[59620]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-7d2ca6ba-57f3-4c9a-ae42-fe6217ba5daf {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 707.812376] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d354d50-0b8e-400c-913b-0bc9d4dddf26 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 707.838856] env[59620]: WARNING nova.virt.vmwareapi.vmops [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 630acd3e-e4e3-483b-984c-7023fd8c77d5 could not be found.
[ 707.839109] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] Instance destroyed {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 707.839283] env[59620]: INFO nova.compute.manager [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 707.839514] env[59620]: DEBUG oslo.service.loopingcall [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59620) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 707.843210] env[59620]: DEBUG nova.compute.manager [-] [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 707.843291] env[59620]: DEBUG nova.network.neutron [-] [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 707.881212] env[59620]: INFO nova.scheduler.client.report [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Deleted allocations for instance d65656c5-2cdb-4152-8e47-20d182d39c7a
[ 707.907180] env[59620]: DEBUG oslo_concurrency.lockutils [None req-bb8aeec1-f9fc-4384-b67c-4de11960bfa9 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Lock "d65656c5-2cdb-4152-8e47-20d182d39c7a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 13.139s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 707.920253] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-384508a1-5a75-43e6-ae53-840f625b684a {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 707.925571] env[59620]: DEBUG nova.network.neutron [-] [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 707.932370] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e554df11-21ac-4775-928d-81d00ed984f5 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 707.936636] env[59620]: DEBUG nova.network.neutron [-] [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 707.970618] env[59620]: INFO nova.compute.manager [-] [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] Took 0.13 seconds to deallocate network for instance.
[ 707.971453] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cdb01fbf-2303-44ec-b700-f816ebd3e9c1 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 707.976757] env[59620]: DEBUG nova.compute.claims [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] Aborting claim: {{(pid=59620) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 707.977031] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 707.980964] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e8d9c3c0-b245-434f-b373-1e4846c89f8d {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 707.995561] env[59620]: DEBUG nova.compute.provider_tree [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 708.003320] env[59620]: DEBUG nova.scheduler.client.report [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 708.019637] env[59620]: DEBUG oslo_concurrency.lockutils [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.312s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 708.020328] env[59620]: ERROR nova.compute.manager [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port efe32174-7153-4005-a87b-4cc066a6d6b7, please check neutron logs for more information.
[ 708.020328] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] Traceback (most recent call last):
[ 708.020328] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 708.020328] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] self.driver.spawn(context, instance, image_meta,
[ 708.020328] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 708.020328] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 708.020328] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 708.020328] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] vm_ref = self.build_virtual_machine(instance,
[ 708.020328] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 708.020328] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] vif_infos = vmwarevif.get_vif_info(self._session,
[ 708.020328] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 708.020944] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] for vif in network_info:
[ 708.020944] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 708.020944] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] return self._sync_wrapper(fn, *args, **kwargs)
[ 708.020944] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 708.020944] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] self.wait()
[ 708.020944] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 708.020944] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] self[:] = self._gt.wait()
[ 708.020944] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 708.020944] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] return self._exit_event.wait()
[ 708.020944] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 708.020944] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] result = hub.switch()
[ 708.020944] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 708.020944] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] return self.greenlet.switch()
[ 708.020944] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 708.021341] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] result = function(*args, **kwargs)
[ 708.021341] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 708.021341] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] return func(*args, **kwargs)
[ 708.021341] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 708.021341] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] raise e
[ 708.021341] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 708.021341] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] nwinfo = self.network_api.allocate_for_instance(
[ 708.021341] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 708.021341] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] created_port_ids = self._update_ports_for_instance(
[ 708.021341] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 708.021341] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] with excutils.save_and_reraise_exception():
[ 708.021341] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 708.021341] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] self.force_reraise()
[ 708.021804] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 708.021804] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] raise self.value
[ 708.021804] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 708.021804] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] updated_port = self._update_port(
[ 708.021804] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 708.021804] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] _ensure_no_port_binding_failure(port)
[ 708.021804] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 708.021804] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] raise exception.PortBindingFailed(port_id=port['id'])
[ 708.021804] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] nova.exception.PortBindingFailed: Binding failed for port efe32174-7153-4005-a87b-4cc066a6d6b7, please check neutron logs for more information.
[ 708.021804] env[59620]: ERROR nova.compute.manager [instance: 06b62938-99d6-43a1-af87-aced894bc8d8]
[ 708.021804] env[59620]: DEBUG nova.compute.utils [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] Binding failed for port efe32174-7153-4005-a87b-4cc066a6d6b7, please check neutron logs for more information. {{(pid=59620) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 708.022612] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.046s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 708.025761] env[59620]: DEBUG nova.compute.manager [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] Build of instance 06b62938-99d6-43a1-af87-aced894bc8d8 was re-scheduled: Binding failed for port efe32174-7153-4005-a87b-4cc066a6d6b7, please check neutron logs for more information. {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 708.026198] env[59620]: DEBUG nova.compute.manager [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] Unplugging VIFs for instance {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 708.026506] env[59620]: DEBUG oslo_concurrency.lockutils [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Acquiring lock "refresh_cache-06b62938-99d6-43a1-af87-aced894bc8d8" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 708.026581] env[59620]: DEBUG oslo_concurrency.lockutils [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Acquired lock "refresh_cache-06b62938-99d6-43a1-af87-aced894bc8d8" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 708.026683] env[59620]: DEBUG nova.network.neutron [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 708.182247] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5499edc-aa54-48b1-9ddc-ce4ba529bde9 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 708.190527] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-30509adf-2409-4cb6-8b3c-86e50f759538 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 708.220175] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4ade074-2e7c-40c9-bfbb-98e7c6dfbd07 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 708.227405] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-49647a12-171e-4522-9b08-97927c8a3570 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 708.241674] env[59620]: DEBUG nova.compute.provider_tree [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 708.249989] env[59620]: DEBUG nova.scheduler.client.report [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 708.263323] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.241s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 708.263906] env[59620]: ERROR nova.compute.manager [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 0eb03d48-4be8-4f39-82d5-41e84fcb3939, please check neutron logs for more information.
[ 708.263906] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] Traceback (most recent call last):
[ 708.263906] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 708.263906] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] self.driver.spawn(context, instance, image_meta,
[ 708.263906] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 708.263906] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 708.263906] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 708.263906] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] vm_ref = self.build_virtual_machine(instance,
[ 708.263906] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py",
line 275, in build_virtual_machine [ 708.263906] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] vif_infos = vmwarevif.get_vif_info(self._session, [ 708.263906] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 708.264294] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] for vif in network_info: [ 708.264294] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 708.264294] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] return self._sync_wrapper(fn, *args, **kwargs) [ 708.264294] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 708.264294] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] self.wait() [ 708.264294] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 708.264294] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] self[:] = self._gt.wait() [ 708.264294] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 708.264294] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] return self._exit_event.wait() [ 708.264294] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 708.264294] env[59620]: ERROR nova.compute.manager [instance: 
630acd3e-e4e3-483b-984c-7023fd8c77d5] result = hub.switch() [ 708.264294] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 708.264294] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] return self.greenlet.switch() [ 708.264294] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 708.265377] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] result = function(*args, **kwargs) [ 708.265377] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 708.265377] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] return func(*args, **kwargs) [ 708.265377] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 708.265377] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] raise e [ 708.265377] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 708.265377] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] nwinfo = self.network_api.allocate_for_instance( [ 708.265377] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 708.265377] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] created_port_ids = self._update_ports_for_instance( [ 
708.265377] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 708.265377] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] with excutils.save_and_reraise_exception(): [ 708.265377] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 708.265377] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] self.force_reraise() [ 708.265696] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 708.265696] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] raise self.value [ 708.265696] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 708.265696] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] updated_port = self._update_port( [ 708.265696] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 708.265696] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] _ensure_no_port_binding_failure(port) [ 708.265696] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 708.265696] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] raise exception.PortBindingFailed(port_id=port['id']) [ 708.265696] env[59620]: ERROR 
nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] nova.exception.PortBindingFailed: Binding failed for port 0eb03d48-4be8-4f39-82d5-41e84fcb3939, please check neutron logs for more information. [ 708.265696] env[59620]: ERROR nova.compute.manager [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] [ 708.265696] env[59620]: DEBUG nova.compute.utils [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] Binding failed for port 0eb03d48-4be8-4f39-82d5-41e84fcb3939, please check neutron logs for more information. {{(pid=59620) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 708.267600] env[59620]: DEBUG nova.compute.manager [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] Build of instance 630acd3e-e4e3-483b-984c-7023fd8c77d5 was re-scheduled: Binding failed for port 0eb03d48-4be8-4f39-82d5-41e84fcb3939, please check neutron logs for more information. 
{{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 708.268031] env[59620]: DEBUG nova.compute.manager [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] Unplugging VIFs for instance {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 708.268257] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Acquiring lock "refresh_cache-630acd3e-e4e3-483b-984c-7023fd8c77d5" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 708.268400] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Acquired lock "refresh_cache-630acd3e-e4e3-483b-984c-7023fd8c77d5" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 708.268551] env[59620]: DEBUG nova.network.neutron [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 708.297148] env[59620]: DEBUG nova.network.neutron [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 708.373509] env[59620]: DEBUG nova.network.neutron [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 708.735597] env[59620]: DEBUG nova.network.neutron [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 708.752429] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Releasing lock "refresh_cache-630acd3e-e4e3-483b-984c-7023fd8c77d5" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 708.752618] env[59620]: DEBUG nova.compute.manager [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 708.752808] env[59620]: DEBUG nova.compute.manager [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 708.753032] env[59620]: DEBUG nova.network.neutron [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 708.786237] env[59620]: DEBUG nova.network.neutron [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] Successfully created port: f55d6baf-cc51-4f55-88ec-78d6f8b9c411 {{(pid=59620) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 708.810508] env[59620]: DEBUG nova.network.neutron [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 708.824095] env[59620]: DEBUG nova.network.neutron [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 708.840427] env[59620]: INFO nova.compute.manager [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] [instance: 630acd3e-e4e3-483b-984c-7023fd8c77d5] Took 0.09 seconds to deallocate network for instance. [ 708.916783] env[59620]: DEBUG nova.network.neutron [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 708.936768] env[59620]: DEBUG oslo_concurrency.lockutils [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Releasing lock "refresh_cache-06b62938-99d6-43a1-af87-aced894bc8d8" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 708.936824] env[59620]: DEBUG nova.compute.manager [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 708.940021] env[59620]: DEBUG nova.compute.manager [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 708.940021] env[59620]: DEBUG nova.network.neutron [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 708.966209] env[59620]: INFO nova.scheduler.client.report [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Deleted allocations for instance 630acd3e-e4e3-483b-984c-7023fd8c77d5 [ 708.991781] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e76fd938-6cea-43d0-a2d5-b39b455f0e5f tempest-ServerMetadataTestJSON-521948342 tempest-ServerMetadataTestJSON-521948342-project-member] Lock "630acd3e-e4e3-483b-984c-7023fd8c77d5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 12.851s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 709.025889] env[59620]: DEBUG nova.network.neutron [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 709.037631] env[59620]: DEBUG nova.network.neutron [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 709.051406] env[59620]: INFO nova.compute.manager [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] [instance: 06b62938-99d6-43a1-af87-aced894bc8d8] Took 0.11 seconds to deallocate network for instance. [ 709.176251] env[59620]: DEBUG oslo_concurrency.lockutils [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Acquiring lock "21f0c0f4-9fbd-4d34-be36-6dae6538bf9c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 709.176545] env[59620]: DEBUG oslo_concurrency.lockutils [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Lock "21f0c0f4-9fbd-4d34-be36-6dae6538bf9c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 709.189172] env[59620]: DEBUG nova.compute.manager [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] [instance: 
21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] Starting instance... {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 709.200052] env[59620]: INFO nova.scheduler.client.report [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Deleted allocations for instance 06b62938-99d6-43a1-af87-aced894bc8d8 [ 709.229692] env[59620]: DEBUG oslo_concurrency.lockutils [None req-b77ca966-bc92-488d-ad60-79efd9f488e9 tempest-ServerActionsV293TestJSON-186183883 tempest-ServerActionsV293TestJSON-186183883-project-member] Lock "06b62938-99d6-43a1-af87-aced894bc8d8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 13.609s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 709.263627] env[59620]: DEBUG oslo_concurrency.lockutils [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 709.264464] env[59620]: DEBUG oslo_concurrency.lockutils [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 709.265998] env[59620]: INFO nova.compute.claims [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] [instance: 
21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 709.450781] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d489fb6-819b-4567-be46-1849fdd32d9b {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 709.460098] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62c22079-99cf-4ed0-a8ec-b2d4e771bd10 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 709.499305] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a383690-edb6-4b58-a5b5-79d86f45e474 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 709.508043] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f276821f-9955-4ed1-ac5d-326479479b18 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 709.522718] env[59620]: DEBUG nova.compute.provider_tree [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 709.534523] env[59620]: DEBUG nova.scheduler.client.report [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': 
{'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 709.549686] env[59620]: DEBUG oslo_concurrency.lockutils [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.286s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 709.550237] env[59620]: DEBUG nova.compute.manager [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] Start building networks asynchronously for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 709.592659] env[59620]: DEBUG nova.compute.utils [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Using /dev/sd instead of None {{(pid=59620) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 709.593909] env[59620]: DEBUG nova.compute.manager [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] Allocating IP information in the background. 
{{(pid=59620) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 709.594145] env[59620]: DEBUG nova.network.neutron [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] allocate_for_instance() {{(pid=59620) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 709.603650] env[59620]: DEBUG nova.compute.manager [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] Start building block device mappings for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 709.654971] env[59620]: ERROR nova.compute.manager [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port bc0d8fad-86d1-4cf5-9df4-6db7cc8f5e4d, please check neutron logs for more information. 
[ 709.654971] env[59620]: ERROR nova.compute.manager Traceback (most recent call last): [ 709.654971] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 709.654971] env[59620]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 709.654971] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 709.654971] env[59620]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 709.654971] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 709.654971] env[59620]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 709.654971] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 709.654971] env[59620]: ERROR nova.compute.manager self.force_reraise() [ 709.654971] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 709.654971] env[59620]: ERROR nova.compute.manager raise self.value [ 709.654971] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 709.654971] env[59620]: ERROR nova.compute.manager updated_port = self._update_port( [ 709.654971] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 709.654971] env[59620]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 709.655512] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 709.655512] env[59620]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 709.655512] env[59620]: ERROR 
nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port bc0d8fad-86d1-4cf5-9df4-6db7cc8f5e4d, please check neutron logs for more information. [ 709.655512] env[59620]: ERROR nova.compute.manager [ 709.655512] env[59620]: Traceback (most recent call last): [ 709.655512] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 709.655512] env[59620]: listener.cb(fileno) [ 709.655512] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 709.655512] env[59620]: result = function(*args, **kwargs) [ 709.655512] env[59620]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 709.655512] env[59620]: return func(*args, **kwargs) [ 709.655512] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 709.655512] env[59620]: raise e [ 709.655512] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 709.655512] env[59620]: nwinfo = self.network_api.allocate_for_instance( [ 709.655512] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 709.655512] env[59620]: created_port_ids = self._update_ports_for_instance( [ 709.655512] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 709.655512] env[59620]: with excutils.save_and_reraise_exception(): [ 709.655512] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 709.655512] env[59620]: self.force_reraise() [ 709.655512] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 709.655512] env[59620]: raise self.value [ 709.655512] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 709.655512] env[59620]: updated_port = self._update_port( [ 709.655512] 
env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 709.655512] env[59620]: _ensure_no_port_binding_failure(port) [ 709.655512] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 709.655512] env[59620]: raise exception.PortBindingFailed(port_id=port['id']) [ 709.656250] env[59620]: nova.exception.PortBindingFailed: Binding failed for port bc0d8fad-86d1-4cf5-9df4-6db7cc8f5e4d, please check neutron logs for more information. [ 709.656250] env[59620]: Removing descriptor: 16 [ 709.656250] env[59620]: ERROR nova.compute.manager [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port bc0d8fad-86d1-4cf5-9df4-6db7cc8f5e4d, please check neutron logs for more information. [ 709.656250] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] Traceback (most recent call last): [ 709.656250] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 709.656250] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] yield resources [ 709.656250] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 709.656250] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] self.driver.spawn(context, instance, image_meta, [ 709.656250] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 709.656250] env[59620]: ERROR nova.compute.manager 
[instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] self._vmops.spawn(context, instance, image_meta, injected_files, [ 709.656250] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 709.656250] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] vm_ref = self.build_virtual_machine(instance, [ 709.656577] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 709.656577] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] vif_infos = vmwarevif.get_vif_info(self._session, [ 709.656577] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 709.656577] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] for vif in network_info: [ 709.656577] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 709.656577] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] return self._sync_wrapper(fn, *args, **kwargs) [ 709.656577] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 709.656577] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] self.wait() [ 709.656577] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 709.656577] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] self[:] = self._gt.wait() [ 709.656577] 
env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 709.656577] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] return self._exit_event.wait() [ 709.656577] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 709.656577] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] result = hub.switch() [ 709.656901] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 709.656901] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] return self.greenlet.switch() [ 709.656901] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 709.656901] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] result = function(*args, **kwargs) [ 709.656901] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 709.656901] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] return func(*args, **kwargs) [ 709.656901] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 709.656901] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] raise e [ 709.656901] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] File 
"/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 709.656901] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] nwinfo = self.network_api.allocate_for_instance( [ 709.656901] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 709.656901] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] created_port_ids = self._update_ports_for_instance( [ 709.656901] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 709.657280] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] with excutils.save_and_reraise_exception(): [ 709.657280] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 709.657280] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] self.force_reraise() [ 709.657280] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 709.657280] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] raise self.value [ 709.657280] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 709.657280] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] updated_port = self._update_port( [ 709.657280] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] File 
"/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 709.657280] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] _ensure_no_port_binding_failure(port) [ 709.657280] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 709.657280] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] raise exception.PortBindingFailed(port_id=port['id']) [ 709.657280] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] nova.exception.PortBindingFailed: Binding failed for port bc0d8fad-86d1-4cf5-9df4-6db7cc8f5e4d, please check neutron logs for more information. [ 709.657280] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] [ 709.657647] env[59620]: INFO nova.compute.manager [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] Terminating instance [ 709.657876] env[59620]: DEBUG oslo_concurrency.lockutils [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Acquiring lock "refresh_cache-588eb672-6240-46cd-8e93-b38c9e2829bf" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 709.658077] env[59620]: DEBUG oslo_concurrency.lockutils [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Acquired lock "refresh_cache-588eb672-6240-46cd-8e93-b38c9e2829bf" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 709.658291] env[59620]: DEBUG nova.network.neutron [None 
req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 709.678482] env[59620]: DEBUG nova.compute.manager [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] Start spawning the instance on the hypervisor. {{(pid=59620) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 709.700284] env[59620]: DEBUG nova.virt.hardware [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T20:10:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T20:10:40Z,direct_url=,disk_format='vmdk',id=2efa4364-ba59-4de9-978f-169a769ee710,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f77c77bd878b4e17895863a4a7a8191f',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T20:10:41Z,virtual_size=,visibility=), allow threads: False {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 709.700534] env[59620]: DEBUG nova.virt.hardware [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 
tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Flavor limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 709.700687] env[59620]: DEBUG nova.virt.hardware [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Image limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 709.700864] env[59620]: DEBUG nova.virt.hardware [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Flavor pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 709.701039] env[59620]: DEBUG nova.virt.hardware [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Image pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 709.701234] env[59620]: DEBUG nova.virt.hardware [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 709.701453] env[59620]: DEBUG nova.virt.hardware [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 709.701622] env[59620]: DEBUG nova.virt.hardware [None 
req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 709.701785] env[59620]: DEBUG nova.virt.hardware [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Got 1 possible topologies {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 709.701941] env[59620]: DEBUG nova.virt.hardware [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 709.702124] env[59620]: DEBUG nova.virt.hardware [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 709.702968] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f62044f-f9fd-41fa-bc73-3a3943a637cd {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 709.711467] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a48a45b7-8bf9-41d7-843b-14feff61cb44 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 709.752840] env[59620]: DEBUG nova.network.neutron [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 
tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 709.877143] env[59620]: ERROR nova.compute.manager [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 89928e6b-9db2-4a6a-8579-49fb7f3d9e4f, please check neutron logs for more information. [ 709.877143] env[59620]: ERROR nova.compute.manager Traceback (most recent call last): [ 709.877143] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 709.877143] env[59620]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 709.877143] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 709.877143] env[59620]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 709.877143] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 709.877143] env[59620]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 709.877143] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 709.877143] env[59620]: ERROR nova.compute.manager self.force_reraise() [ 709.877143] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 709.877143] env[59620]: ERROR nova.compute.manager raise self.value [ 709.877143] env[59620]: ERROR 
nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 709.877143] env[59620]: ERROR nova.compute.manager updated_port = self._update_port( [ 709.877143] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 709.877143] env[59620]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 709.877626] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 709.877626] env[59620]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 709.877626] env[59620]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 89928e6b-9db2-4a6a-8579-49fb7f3d9e4f, please check neutron logs for more information. [ 709.877626] env[59620]: ERROR nova.compute.manager [ 709.877626] env[59620]: Traceback (most recent call last): [ 709.877626] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 709.877626] env[59620]: listener.cb(fileno) [ 709.877626] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 709.877626] env[59620]: result = function(*args, **kwargs) [ 709.877626] env[59620]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 709.877626] env[59620]: return func(*args, **kwargs) [ 709.877626] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 709.877626] env[59620]: raise e [ 709.877626] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 709.877626] env[59620]: nwinfo = self.network_api.allocate_for_instance( [ 709.877626] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 709.877626] env[59620]: created_port_ids = self._update_ports_for_instance( [ 709.877626] 
env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 709.877626] env[59620]: with excutils.save_and_reraise_exception(): [ 709.877626] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 709.877626] env[59620]: self.force_reraise() [ 709.877626] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 709.877626] env[59620]: raise self.value [ 709.877626] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 709.877626] env[59620]: updated_port = self._update_port( [ 709.877626] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 709.877626] env[59620]: _ensure_no_port_binding_failure(port) [ 709.877626] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 709.877626] env[59620]: raise exception.PortBindingFailed(port_id=port['id']) [ 709.878765] env[59620]: nova.exception.PortBindingFailed: Binding failed for port 89928e6b-9db2-4a6a-8579-49fb7f3d9e4f, please check neutron logs for more information. [ 709.878765] env[59620]: Removing descriptor: 23 [ 709.878765] env[59620]: ERROR nova.compute.manager [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 89928e6b-9db2-4a6a-8579-49fb7f3d9e4f, please check neutron logs for more information. 
[ 709.878765] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] Traceback (most recent call last): [ 709.878765] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 709.878765] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] yield resources [ 709.878765] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 709.878765] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] self.driver.spawn(context, instance, image_meta, [ 709.878765] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 709.878765] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] self._vmops.spawn(context, instance, image_meta, injected_files, [ 709.878765] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 709.878765] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] vm_ref = self.build_virtual_machine(instance, [ 709.879106] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 709.879106] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] vif_infos = vmwarevif.get_vif_info(self._session, [ 709.879106] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 709.879106] env[59620]: ERROR 
nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] for vif in network_info: [ 709.879106] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 709.879106] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] return self._sync_wrapper(fn, *args, **kwargs) [ 709.879106] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 709.879106] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] self.wait() [ 709.879106] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 709.879106] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] self[:] = self._gt.wait() [ 709.879106] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 709.879106] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] return self._exit_event.wait() [ 709.879106] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 709.879106] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] result = hub.switch() [ 709.879557] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 709.879557] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] return self.greenlet.switch() [ 709.879557] env[59620]: ERROR 
nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 709.879557] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] result = function(*args, **kwargs) [ 709.879557] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 709.879557] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] return func(*args, **kwargs) [ 709.879557] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 709.879557] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] raise e [ 709.879557] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 709.879557] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] nwinfo = self.network_api.allocate_for_instance( [ 709.879557] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 709.879557] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] created_port_ids = self._update_ports_for_instance( [ 709.879557] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 709.879924] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] with excutils.save_and_reraise_exception(): [ 709.879924] env[59620]: ERROR nova.compute.manager [instance: 
3f53c35f-40ea-4094-89e2-624b156e5560] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 709.879924] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] self.force_reraise() [ 709.879924] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 709.879924] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] raise self.value [ 709.879924] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 709.879924] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] updated_port = self._update_port( [ 709.879924] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 709.879924] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] _ensure_no_port_binding_failure(port) [ 709.879924] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 709.879924] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] raise exception.PortBindingFailed(port_id=port['id']) [ 709.879924] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] nova.exception.PortBindingFailed: Binding failed for port 89928e6b-9db2-4a6a-8579-49fb7f3d9e4f, please check neutron logs for more information. 
[ 709.879924] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] [ 709.880350] env[59620]: INFO nova.compute.manager [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] Terminating instance [ 709.880761] env[59620]: DEBUG oslo_concurrency.lockutils [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Acquiring lock "refresh_cache-3f53c35f-40ea-4094-89e2-624b156e5560" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 709.880954] env[59620]: DEBUG oslo_concurrency.lockutils [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Acquired lock "refresh_cache-3f53c35f-40ea-4094-89e2-624b156e5560" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 709.881150] env[59620]: DEBUG nova.network.neutron [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 709.898559] env[59620]: DEBUG nova.policy [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '65bfc207577b49578dd11f62fa61e23e', 'user_domain_id': 'default', 'system_scope': None, 
'domain_id': None, 'project_id': '15e5a73b9fc04caaa60d4dcc2f8b0380', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59620) authorize /opt/stack/nova/nova/policy.py:203}} [ 709.971729] env[59620]: DEBUG nova.network.neutron [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 710.169121] env[59620]: DEBUG oslo_concurrency.lockutils [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Acquiring lock "16b35372-2e84-4f6c-ab01-fcbc86e9cca0" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 710.169121] env[59620]: DEBUG oslo_concurrency.lockutils [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Lock "16b35372-2e84-4f6c-ab01-fcbc86e9cca0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 710.182077] env[59620]: DEBUG nova.compute.manager [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] Starting instance... 
{{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 710.233022] env[59620]: DEBUG oslo_concurrency.lockutils [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 710.233022] env[59620]: DEBUG oslo_concurrency.lockutils [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 710.233022] env[59620]: INFO nova.compute.claims [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 710.440113] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-36287326-789a-47b9-8c09-f217f304ed95 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 710.447579] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5aca97e2-e316-4d13-b4e2-5af851641f31 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 710.480582] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba9d15c2-217d-434c-bc72-b01c910ba756 {{(pid=59620) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 710.488956] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0fe6d628-758f-4699-9f15-a92b28b66c78 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 710.502760] env[59620]: DEBUG nova.compute.provider_tree [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 710.517211] env[59620]: DEBUG nova.scheduler.client.report [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 710.536084] env[59620]: DEBUG oslo_concurrency.lockutils [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.302s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 710.536084] env[59620]: DEBUG nova.compute.manager [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 
tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] Start building networks asynchronously for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 710.578606] env[59620]: DEBUG nova.compute.utils [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Using /dev/sd instead of None {{(pid=59620) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 710.579309] env[59620]: DEBUG nova.compute.manager [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] Allocating IP information in the background. {{(pid=59620) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 710.579603] env[59620]: DEBUG nova.network.neutron [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] allocate_for_instance() {{(pid=59620) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 710.584435] env[59620]: DEBUG nova.network.neutron [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 710.587389] env[59620]: DEBUG nova.compute.manager [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] Start building 
block device mappings for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 710.597028] env[59620]: DEBUG oslo_concurrency.lockutils [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Releasing lock "refresh_cache-588eb672-6240-46cd-8e93-b38c9e2829bf" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 710.597028] env[59620]: DEBUG nova.compute.manager [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] Start destroying the instance on the hypervisor. {{(pid=59620) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 710.597028] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] Destroying instance {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 710.597028] env[59620]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-1bbffe28-8b32-4d75-84a6-e9aaeb7dd720 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 710.606747] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4baebf9e-e0a4-46e4-b41a-f75b86935223 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 710.635655] env[59620]: WARNING nova.virt.vmwareapi.vmops [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 
588eb672-6240-46cd-8e93-b38c9e2829bf] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 588eb672-6240-46cd-8e93-b38c9e2829bf could not be found. [ 710.635655] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] Instance destroyed {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 710.635655] env[59620]: INFO nova.compute.manager [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] Took 0.04 seconds to destroy the instance on the hypervisor. [ 710.635655] env[59620]: DEBUG oslo.service.loopingcall [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=59620) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 710.635655] env[59620]: DEBUG nova.compute.manager [-] [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 710.635944] env[59620]: DEBUG nova.network.neutron [-] [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 710.673926] env[59620]: DEBUG nova.compute.manager [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] Start spawning the instance on the hypervisor. {{(pid=59620) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 710.698281] env[59620]: DEBUG nova.virt.hardware [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T20:10:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T20:10:40Z,direct_url=,disk_format='vmdk',id=2efa4364-ba59-4de9-978f-169a769ee710,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f77c77bd878b4e17895863a4a7a8191f',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T20:10:41Z,virtual_size=,visibility=), allow threads: False {{(pid=59620) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:563}} [ 710.698528] env[59620]: DEBUG nova.virt.hardware [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Flavor limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 710.698681] env[59620]: DEBUG nova.virt.hardware [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Image limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 710.698853] env[59620]: DEBUG nova.virt.hardware [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Flavor pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 710.698994] env[59620]: DEBUG nova.virt.hardware [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Image pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 710.701984] env[59620]: DEBUG nova.virt.hardware [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 710.702182] env[59620]: DEBUG nova.virt.hardware [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59620) 
_get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 710.702350] env[59620]: DEBUG nova.virt.hardware [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 710.702518] env[59620]: DEBUG nova.virt.hardware [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Got 1 possible topologies {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 710.702678] env[59620]: DEBUG nova.virt.hardware [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 710.702936] env[59620]: DEBUG nova.virt.hardware [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 710.704190] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9bd92f0d-c52e-411a-96f7-c4f0abe5505f {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 710.713342] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-abee2f9b-f0b2-495e-9c6e-c26743d1eca1 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 710.728706] env[59620]: DEBUG nova.policy 
[None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '49ac2f7c40cb472d9b9678a537ea011f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '57e287820a334ab58ec0c42d68339b66', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59620) authorize /opt/stack/nova/nova/policy.py:203}} [ 710.743647] env[59620]: DEBUG nova.network.neutron [-] [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 710.751817] env[59620]: DEBUG nova.network.neutron [-] [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 710.761606] env[59620]: INFO nova.compute.manager [-] [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] Took 0.13 seconds to deallocate network for instance. 
[ 710.764211] env[59620]: DEBUG nova.compute.claims [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] Aborting claim: {{(pid=59620) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 710.764342] env[59620]: DEBUG oslo_concurrency.lockutils [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 710.764550] env[59620]: DEBUG oslo_concurrency.lockutils [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 710.820095] env[59620]: DEBUG nova.network.neutron [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 710.829905] env[59620]: DEBUG oslo_concurrency.lockutils [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Releasing lock "refresh_cache-3f53c35f-40ea-4094-89e2-624b156e5560" {{(pid=59620) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 710.830479] env[59620]: DEBUG nova.compute.manager [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] Start destroying the instance on the hypervisor. {{(pid=59620) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 710.830737] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] Destroying instance {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 710.831383] env[59620]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-11b81944-5344-47d0-8c94-a432bf13d5ba {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 710.845737] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e74cb143-b991-4a2b-ad66-071bb0419584 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 710.873691] env[59620]: WARNING nova.virt.vmwareapi.vmops [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 3f53c35f-40ea-4094-89e2-624b156e5560 could not be found. 
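The `WARNING ... Instance does not exist on backend: nova.exception.InstanceNotFound` immediately followed by `Instance destroyed` is a deliberate pattern: during teardown, a VM that is already gone from the hypervisor is treated as successfully destroyed rather than as an error, so deletes stay idempotent. A hedged sketch of that shape — the class and function names here are illustrative stand-ins, not Nova's actual code:

```python
class InstanceNotFound(Exception):
    """Local stand-in for nova.exception.InstanceNotFound."""

def destroy(instance_uuid, backend):
    """Destroy a VM, tolerating it already being absent on the backend."""
    try:
        backend.delete_vm(instance_uuid)
    except InstanceNotFound:
        # The hypervisor has no such VM; warn and proceed as if destroyed,
        # mirroring the WARNING -> "Instance destroyed" sequence in the log.
        print(f"Instance does not exist on backend: {instance_uuid}")
    print(f"Instance destroyed: {instance_uuid}")
```

This is why the log can report `Took 0.04 seconds to destroy the instance on the hypervisor` and continue into network deallocation even though the VM was never found.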
[ 710.874236] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] Instance destroyed {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 710.874420] env[59620]: INFO nova.compute.manager [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] Took 0.04 seconds to destroy the instance on the hypervisor. [ 710.874667] env[59620]: DEBUG oslo.service.loopingcall [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=59620) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 710.877844] env[59620]: DEBUG nova.compute.manager [-] [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 710.877956] env[59620]: DEBUG nova.network.neutron [-] [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 710.959486] env[59620]: ERROR nova.compute.manager [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port ee509bee-9216-4859-ac9b-0313db219df0, please check neutron logs for more information. [ 710.959486] env[59620]: ERROR nova.compute.manager Traceback (most recent call last): [ 710.959486] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 710.959486] env[59620]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 710.959486] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 710.959486] env[59620]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 710.959486] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 710.959486] env[59620]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 710.959486] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 710.959486] env[59620]: ERROR nova.compute.manager self.force_reraise() [ 710.959486] env[59620]: 
ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 710.959486] env[59620]: ERROR nova.compute.manager raise self.value [ 710.959486] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 710.959486] env[59620]: ERROR nova.compute.manager updated_port = self._update_port( [ 710.959486] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 710.959486] env[59620]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 710.960050] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 710.960050] env[59620]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 710.960050] env[59620]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port ee509bee-9216-4859-ac9b-0313db219df0, please check neutron logs for more information. 
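The traceback above bottoms out in `_ensure_no_port_binding_failure` (nova/network/neutron.py), which raises `PortBindingFailed` when Neutron hands back a port whose binding could not be completed. A minimal sketch of that check, assuming Neutron's convention of marking an unbindable port with `binding:vif_type = 'binding_failed'`; the exception class is a local stand-in for `nova.exception.PortBindingFailed`:

```python
class PortBindingFailed(Exception):
    """Local stand-in for nova.exception.PortBindingFailed."""
    def __init__(self, port_id):
        super().__init__(
            f"Binding failed for port {port_id}, "
            "please check neutron logs for more information.")
        self.port_id = port_id

def ensure_no_port_binding_failure(port):
    """Reject a Neutron port dict whose binding failed."""
    # Neutron sets vif_type to 'binding_failed' when no mechanism driver
    # could bind the port (wrong network type, dead agent, etc.).
    if port.get("binding:vif_type") == "binding_failed":
        raise PortBindingFailed(port_id=port["id"])
```

The exception then propagates up through `_update_ports_for_instance` (re-raised via `save_and_reraise_exception`) and `allocate_for_instance`, which is exactly the call chain the traceback shows; the root cause lives on the Neutron side, hence the "please check neutron logs" hint in the message.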
[ 710.960050] env[59620]: ERROR nova.compute.manager [ 710.960050] env[59620]: Traceback (most recent call last): [ 710.960050] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 710.960050] env[59620]: listener.cb(fileno) [ 710.960050] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 710.960050] env[59620]: result = function(*args, **kwargs) [ 710.960050] env[59620]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 710.960050] env[59620]: return func(*args, **kwargs) [ 710.960050] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 710.960050] env[59620]: raise e [ 710.960050] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 710.960050] env[59620]: nwinfo = self.network_api.allocate_for_instance( [ 710.960050] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 710.960050] env[59620]: created_port_ids = self._update_ports_for_instance( [ 710.960050] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 710.960050] env[59620]: with excutils.save_and_reraise_exception(): [ 710.960050] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 710.960050] env[59620]: self.force_reraise() [ 710.960050] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 710.960050] env[59620]: raise self.value [ 710.960050] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 710.960050] env[59620]: updated_port = self._update_port( [ 710.960050] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 710.960050] env[59620]: _ensure_no_port_binding_failure(port) [ 710.960050] 
env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 710.960050] env[59620]: raise exception.PortBindingFailed(port_id=port['id']) [ 710.960732] env[59620]: nova.exception.PortBindingFailed: Binding failed for port ee509bee-9216-4859-ac9b-0313db219df0, please check neutron logs for more information. [ 710.960732] env[59620]: Removing descriptor: 11 [ 710.960732] env[59620]: ERROR nova.compute.manager [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port ee509bee-9216-4859-ac9b-0313db219df0, please check neutron logs for more information. [ 710.960732] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] Traceback (most recent call last): [ 710.960732] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 710.960732] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] yield resources [ 710.960732] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 710.960732] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] self.driver.spawn(context, instance, image_meta, [ 710.960732] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 710.960732] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] self._vmops.spawn(context, instance, image_meta, injected_files, [ 710.960732] env[59620]: ERROR 
nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 710.960732] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] vm_ref = self.build_virtual_machine(instance, [ 710.961061] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 710.961061] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] vif_infos = vmwarevif.get_vif_info(self._session, [ 710.961061] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 710.961061] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] for vif in network_info: [ 710.961061] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 710.961061] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] return self._sync_wrapper(fn, *args, **kwargs) [ 710.961061] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 710.961061] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] self.wait() [ 710.961061] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 710.961061] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] self[:] = self._gt.wait() [ 710.961061] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] File 
"/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 710.961061] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] return self._exit_event.wait() [ 710.961061] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 710.961061] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] result = hub.switch() [ 710.961415] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 710.961415] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] return self.greenlet.switch() [ 710.961415] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 710.961415] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] result = function(*args, **kwargs) [ 710.961415] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 710.961415] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] return func(*args, **kwargs) [ 710.961415] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 710.961415] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] raise e [ 710.961415] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 710.961415] env[59620]: ERROR 
nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] nwinfo = self.network_api.allocate_for_instance( [ 710.961415] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 710.961415] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] created_port_ids = self._update_ports_for_instance( [ 710.961415] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 710.962630] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] with excutils.save_and_reraise_exception(): [ 710.962630] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 710.962630] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] self.force_reraise() [ 710.962630] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 710.962630] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] raise self.value [ 710.962630] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 710.962630] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] updated_port = self._update_port( [ 710.962630] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 710.962630] env[59620]: ERROR nova.compute.manager [instance: 
e31d29d1-c49c-4696-85c9-11cb985a7bfd] _ensure_no_port_binding_failure(port) [ 710.962630] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 710.962630] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] raise exception.PortBindingFailed(port_id=port['id']) [ 710.962630] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] nova.exception.PortBindingFailed: Binding failed for port ee509bee-9216-4859-ac9b-0313db219df0, please check neutron logs for more information. [ 710.962630] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] [ 710.962999] env[59620]: INFO nova.compute.manager [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] Terminating instance [ 710.963958] env[59620]: DEBUG oslo_concurrency.lockutils [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Acquiring lock "refresh_cache-e31d29d1-c49c-4696-85c9-11cb985a7bfd" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 710.964129] env[59620]: DEBUG oslo_concurrency.lockutils [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Acquired lock "refresh_cache-e31d29d1-c49c-4696-85c9-11cb985a7bfd" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 710.964507] env[59620]: DEBUG nova.network.neutron [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 
tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 710.981185] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f3d7f1d-ba66-4d21-885d-adcff38344f9 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 710.992518] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a92a01b8-3267-4895-8864-4a30129dd5f4 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 710.999034] env[59620]: DEBUG nova.network.neutron [-] [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 711.031213] env[59620]: DEBUG nova.network.neutron [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 711.033842] env[59620]: DEBUG nova.network.neutron [-] [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 711.037461] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-55764b85-39ef-4f3b-b2e3-017e7ef9e9f2 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 711.046989] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cdfb2197-c4a3-46d4-aeb5-52182f548cf5 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 711.049806] env[59620]: INFO nova.compute.manager [-] [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] Took 0.17 seconds to deallocate network for instance. 
[ 711.059489] env[59620]: DEBUG nova.compute.provider_tree [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 711.060796] env[59620]: DEBUG nova.compute.claims [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] Aborting claim: {{(pid=59620) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 711.061013] env[59620]: DEBUG oslo_concurrency.lockutils [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 711.074424] env[59620]: DEBUG nova.scheduler.client.report [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 711.092398] env[59620]: DEBUG 
oslo_concurrency.lockutils [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.328s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 711.093026] env[59620]: ERROR nova.compute.manager [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port bc0d8fad-86d1-4cf5-9df4-6db7cc8f5e4d, please check neutron logs for more information. [ 711.093026] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] Traceback (most recent call last): [ 711.093026] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 711.093026] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] self.driver.spawn(context, instance, image_meta, [ 711.093026] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 711.093026] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] self._vmops.spawn(context, instance, image_meta, injected_files, [ 711.093026] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 711.093026] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] vm_ref = self.build_virtual_machine(instance, [ 711.093026] 
env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 711.093026] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] vif_infos = vmwarevif.get_vif_info(self._session, [ 711.093026] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 711.093459] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] for vif in network_info: [ 711.093459] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 711.093459] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] return self._sync_wrapper(fn, *args, **kwargs) [ 711.093459] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 711.093459] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] self.wait() [ 711.093459] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 711.093459] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] self[:] = self._gt.wait() [ 711.093459] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 711.093459] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] return self._exit_event.wait() [ 711.093459] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] File 
"/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 711.093459] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] result = hub.switch() [ 711.093459] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 711.093459] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] return self.greenlet.switch() [ 711.093459] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 711.093797] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] result = function(*args, **kwargs) [ 711.093797] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 711.093797] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] return func(*args, **kwargs) [ 711.093797] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 711.093797] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] raise e [ 711.093797] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 711.093797] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] nwinfo = self.network_api.allocate_for_instance( [ 711.093797] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 711.093797] 
env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] created_port_ids = self._update_ports_for_instance( [ 711.093797] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 711.093797] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] with excutils.save_and_reraise_exception(): [ 711.093797] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 711.093797] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] self.force_reraise() [ 711.094103] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 711.094103] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] raise self.value [ 711.094103] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 711.094103] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] updated_port = self._update_port( [ 711.094103] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 711.094103] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] _ensure_no_port_binding_failure(port) [ 711.094103] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 711.094103] env[59620]: ERROR 
nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] raise exception.PortBindingFailed(port_id=port['id']) [ 711.094103] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] nova.exception.PortBindingFailed: Binding failed for port bc0d8fad-86d1-4cf5-9df4-6db7cc8f5e4d, please check neutron logs for more information. [ 711.094103] env[59620]: ERROR nova.compute.manager [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] [ 711.094401] env[59620]: DEBUG nova.compute.utils [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] Binding failed for port bc0d8fad-86d1-4cf5-9df4-6db7cc8f5e4d, please check neutron logs for more information. {{(pid=59620) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 711.094845] env[59620]: DEBUG oslo_concurrency.lockutils [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.034s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 711.097497] env[59620]: DEBUG nova.compute.manager [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] Build of instance 588eb672-6240-46cd-8e93-b38c9e2829bf was re-scheduled: Binding failed for port bc0d8fad-86d1-4cf5-9df4-6db7cc8f5e4d, please check neutron logs for more information. 
{{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 711.097941] env[59620]: DEBUG nova.compute.manager [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] Unplugging VIFs for instance {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 711.098551] env[59620]: DEBUG oslo_concurrency.lockutils [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Acquiring lock "refresh_cache-588eb672-6240-46cd-8e93-b38c9e2829bf" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 711.098551] env[59620]: DEBUG oslo_concurrency.lockutils [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Acquired lock "refresh_cache-588eb672-6240-46cd-8e93-b38c9e2829bf" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 711.098551] env[59620]: DEBUG nova.network.neutron [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 711.209872] env[59620]: DEBUG nova.network.neutron [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 711.290822] env[59620]: DEBUG nova.network.neutron [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] Successfully created port: 2112df5f-4ebb-4b2d-b9eb-3bca9d4e9655 {{(pid=59620) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 711.293589] env[59620]: DEBUG nova.network.neutron [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 711.307314] env[59620]: DEBUG oslo_concurrency.lockutils [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Releasing lock "refresh_cache-e31d29d1-c49c-4696-85c9-11cb985a7bfd" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 711.307314] env[59620]: DEBUG nova.compute.manager [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] Start destroying the instance on the hypervisor. 
{{(pid=59620) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 711.308029] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] Destroying instance {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 711.308029] env[59620]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-ca0ffe3a-808a-4842-afd9-cd285b3f2803 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 711.316863] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e0222ef-099b-40d9-a4b7-1527bddc3bfb {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 711.325816] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab697ae1-7551-4687-b0b6-6a36284341dc {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 711.331912] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-693cbbc5-6f94-4d26-ba09-7c7865ffe016 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 711.381550] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-128b43ab-5fb6-4aa2-ac22-d7efce1209ae {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 711.386380] env[59620]: WARNING nova.virt.vmwareapi.vmops [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] [instance: 
e31d29d1-c49c-4696-85c9-11cb985a7bfd] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance e31d29d1-c49c-4696-85c9-11cb985a7bfd could not be found. [ 711.386453] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] Instance destroyed {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 711.386669] env[59620]: INFO nova.compute.manager [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] Took 0.08 seconds to destroy the instance on the hypervisor. [ 711.387068] env[59620]: DEBUG oslo.service.loopingcall [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=59620) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 711.387595] env[59620]: DEBUG nova.compute.manager [-] [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 711.387595] env[59620]: DEBUG nova.network.neutron [-] [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 711.394139] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-efad3f0b-d289-4bac-826b-e6b0876a945d {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 711.411023] env[59620]: DEBUG nova.compute.provider_tree [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 711.411845] env[59620]: DEBUG nova.network.neutron [-] [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 711.420483] env[59620]: DEBUG nova.network.neutron [-] [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 711.423806] env[59620]: DEBUG nova.scheduler.client.report [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 711.430856] env[59620]: INFO nova.compute.manager [-] [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] Took 0.04 seconds to deallocate network for instance. 
[ 711.433048] env[59620]: DEBUG nova.compute.claims [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] Aborting claim: {{(pid=59620) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 711.434069] env[59620]: DEBUG oslo_concurrency.lockutils [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 711.438325] env[59620]: DEBUG oslo_concurrency.lockutils [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.343s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 711.439335] env[59620]: ERROR nova.compute.manager [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 89928e6b-9db2-4a6a-8579-49fb7f3d9e4f, please check neutron logs for more information. 
[ 711.439335] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] Traceback (most recent call last): [ 711.439335] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 711.439335] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] self.driver.spawn(context, instance, image_meta, [ 711.439335] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 711.439335] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] self._vmops.spawn(context, instance, image_meta, injected_files, [ 711.439335] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 711.439335] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] vm_ref = self.build_virtual_machine(instance, [ 711.439335] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 711.439335] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] vif_infos = vmwarevif.get_vif_info(self._session, [ 711.439335] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 711.439664] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] for vif in network_info: [ 711.439664] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 711.439664] env[59620]: ERROR 
nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] return self._sync_wrapper(fn, *args, **kwargs) [ 711.439664] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 711.439664] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] self.wait() [ 711.439664] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 711.439664] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] self[:] = self._gt.wait() [ 711.439664] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 711.439664] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] return self._exit_event.wait() [ 711.439664] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 711.439664] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] result = hub.switch() [ 711.439664] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 711.439664] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] return self.greenlet.switch() [ 711.439664] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 711.440044] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] result = function(*args, **kwargs) [ 
711.440044] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 711.440044] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] return func(*args, **kwargs) [ 711.440044] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 711.440044] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] raise e [ 711.440044] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 711.440044] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] nwinfo = self.network_api.allocate_for_instance( [ 711.440044] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 711.440044] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] created_port_ids = self._update_ports_for_instance( [ 711.440044] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 711.440044] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] with excutils.save_and_reraise_exception(): [ 711.440044] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 711.440044] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] self.force_reraise() [ 711.440365] env[59620]: ERROR nova.compute.manager [instance: 
3f53c35f-40ea-4094-89e2-624b156e5560] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 711.440365] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] raise self.value [ 711.440365] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 711.440365] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] updated_port = self._update_port( [ 711.440365] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 711.440365] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] _ensure_no_port_binding_failure(port) [ 711.440365] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 711.440365] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] raise exception.PortBindingFailed(port_id=port['id']) [ 711.440365] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] nova.exception.PortBindingFailed: Binding failed for port 89928e6b-9db2-4a6a-8579-49fb7f3d9e4f, please check neutron logs for more information. [ 711.440365] env[59620]: ERROR nova.compute.manager [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] [ 711.440791] env[59620]: DEBUG nova.compute.utils [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] Binding failed for port 89928e6b-9db2-4a6a-8579-49fb7f3d9e4f, please check neutron logs for more information. 
{{(pid=59620) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 711.441959] env[59620]: DEBUG oslo_concurrency.lockutils [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.009s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 711.448517] env[59620]: DEBUG nova.compute.manager [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] Build of instance 3f53c35f-40ea-4094-89e2-624b156e5560 was re-scheduled: Binding failed for port 89928e6b-9db2-4a6a-8579-49fb7f3d9e4f, please check neutron logs for more information. {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 711.448517] env[59620]: DEBUG nova.compute.manager [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] Unplugging VIFs for instance {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 711.448517] env[59620]: DEBUG oslo_concurrency.lockutils [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Acquiring lock "refresh_cache-3f53c35f-40ea-4094-89e2-624b156e5560" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 711.448517] env[59620]: DEBUG oslo_concurrency.lockutils [None req-03d19712-5d7d-41b1-bf45-b29d7202674e 
tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Acquired lock "refresh_cache-3f53c35f-40ea-4094-89e2-624b156e5560" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 711.448826] env[59620]: DEBUG nova.network.neutron [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 711.503170] env[59620]: DEBUG nova.network.neutron [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 711.625975] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-77110641-e412-48c8-89fe-092875f530ec {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 711.634316] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3424eca-6a57-4e55-bde3-2eb9b18d7475 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 711.664967] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e77aac27-c2fb-431c-9af6-f5766e488f8c {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 711.672559] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-26ee6ce9-adb2-4003-881e-f8109f3e17e9 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 711.685744] env[59620]: DEBUG nova.compute.provider_tree [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 711.694156] env[59620]: DEBUG nova.scheduler.client.report [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 
196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 711.712448] env[59620]: DEBUG oslo_concurrency.lockutils [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.270s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 711.713074] env[59620]: ERROR nova.compute.manager [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port ee509bee-9216-4859-ac9b-0313db219df0, please check neutron logs for more information. 
[ 711.713074] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] Traceback (most recent call last): [ 711.713074] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 711.713074] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] self.driver.spawn(context, instance, image_meta, [ 711.713074] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 711.713074] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] self._vmops.spawn(context, instance, image_meta, injected_files, [ 711.713074] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 711.713074] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] vm_ref = self.build_virtual_machine(instance, [ 711.713074] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 711.713074] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] vif_infos = vmwarevif.get_vif_info(self._session, [ 711.713074] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 711.713372] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] for vif in network_info: [ 711.713372] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 711.713372] env[59620]: ERROR 
nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] return self._sync_wrapper(fn, *args, **kwargs) [ 711.713372] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 711.713372] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] self.wait() [ 711.713372] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 711.713372] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] self[:] = self._gt.wait() [ 711.713372] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 711.713372] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] return self._exit_event.wait() [ 711.713372] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 711.713372] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] result = hub.switch() [ 711.713372] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 711.713372] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] return self.greenlet.switch() [ 711.713372] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 711.713687] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] result = function(*args, **kwargs) [ 
711.713687] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 711.713687] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] return func(*args, **kwargs) [ 711.713687] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 711.713687] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] raise e [ 711.713687] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 711.713687] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] nwinfo = self.network_api.allocate_for_instance( [ 711.713687] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 711.713687] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] created_port_ids = self._update_ports_for_instance( [ 711.713687] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 711.713687] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] with excutils.save_and_reraise_exception(): [ 711.713687] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 711.713687] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] self.force_reraise() [ 711.713981] env[59620]: ERROR nova.compute.manager [instance: 
e31d29d1-c49c-4696-85c9-11cb985a7bfd] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 711.713981] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] raise self.value [ 711.713981] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 711.713981] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] updated_port = self._update_port( [ 711.713981] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 711.713981] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] _ensure_no_port_binding_failure(port) [ 711.713981] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 711.713981] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] raise exception.PortBindingFailed(port_id=port['id']) [ 711.713981] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] nova.exception.PortBindingFailed: Binding failed for port ee509bee-9216-4859-ac9b-0313db219df0, please check neutron logs for more information. [ 711.713981] env[59620]: ERROR nova.compute.manager [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] [ 711.714221] env[59620]: DEBUG nova.compute.utils [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] Binding failed for port ee509bee-9216-4859-ac9b-0313db219df0, please check neutron logs for more information. 
{{(pid=59620) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 711.715362] env[59620]: DEBUG nova.compute.manager [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] Build of instance e31d29d1-c49c-4696-85c9-11cb985a7bfd was re-scheduled: Binding failed for port ee509bee-9216-4859-ac9b-0313db219df0, please check neutron logs for more information. {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 711.715779] env[59620]: DEBUG nova.compute.manager [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] Unplugging VIFs for instance {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 711.716594] env[59620]: DEBUG oslo_concurrency.lockutils [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Acquiring lock "refresh_cache-e31d29d1-c49c-4696-85c9-11cb985a7bfd" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 711.716594] env[59620]: DEBUG oslo_concurrency.lockutils [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Acquired lock "refresh_cache-e31d29d1-c49c-4696-85c9-11cb985a7bfd" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 711.716594] env[59620]: DEBUG nova.network.neutron [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] 
[instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 711.739987] env[59620]: DEBUG nova.network.neutron [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 711.749635] env[59620]: DEBUG oslo_concurrency.lockutils [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Releasing lock "refresh_cache-3f53c35f-40ea-4094-89e2-624b156e5560" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 711.749924] env[59620]: DEBUG nova.compute.manager [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 711.750247] env[59620]: DEBUG nova.compute.manager [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 711.750428] env[59620]: DEBUG nova.network.neutron [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 711.781171] env[59620]: DEBUG nova.network.neutron [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 711.789533] env[59620]: DEBUG nova.network.neutron [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 711.797600] env[59620]: INFO nova.compute.manager [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] [instance: 3f53c35f-40ea-4094-89e2-624b156e5560] Took 0.05 seconds to deallocate network for instance. [ 711.832882] env[59620]: DEBUG nova.network.neutron [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 711.913392] env[59620]: INFO nova.scheduler.client.report [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Deleted allocations for instance 3f53c35f-40ea-4094-89e2-624b156e5560 [ 711.937144] env[59620]: DEBUG oslo_concurrency.lockutils [None req-03d19712-5d7d-41b1-bf45-b29d7202674e tempest-FloatingIPsAssociationNegativeTestJSON-641891694 tempest-FloatingIPsAssociationNegativeTestJSON-641891694-project-member] Lock "3f53c35f-40ea-4094-89e2-624b156e5560" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 12.747s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 712.020356] env[59620]: DEBUG nova.network.neutron [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 712.029300] env[59620]: DEBUG oslo_concurrency.lockutils [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Releasing lock "refresh_cache-588eb672-6240-46cd-8e93-b38c9e2829bf" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 712.029523] env[59620]: DEBUG nova.compute.manager [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 712.029698] env[59620]: DEBUG nova.compute.manager [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 712.029880] env[59620]: DEBUG nova.network.neutron [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 712.155603] env[59620]: DEBUG nova.network.neutron [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 712.166900] env[59620]: DEBUG nova.network.neutron [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 712.176019] env[59620]: INFO nova.compute.manager [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 588eb672-6240-46cd-8e93-b38c9e2829bf] Took 0.14 seconds to deallocate network for instance. 
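The recurring failure in the tracebacks above is raised by `_ensure_no_port_binding_failure` in `nova/network/neutron.py` (line 294 in this deployment). The sketch below is a simplified standalone reimplementation of that check, not nova's actual code: the `PortBindingFailed` class here is a stand-in for `nova.exception.PortBindingFailed`, and the message text mirrors what appears in the log.

```python
# Standalone sketch of the check behind the PortBindingFailed errors above.
# Neutron signals a failed port binding by setting the port's
# binding:vif_type attribute to 'binding_failed' rather than returning an
# API error, so Nova inspects the port after creating or updating it.

VIF_TYPE_BINDING_FAILED = 'binding_failed'


class PortBindingFailed(Exception):
    """Simplified stand-in for nova.exception.PortBindingFailed."""

    def __init__(self, port_id):
        super().__init__(
            f"Binding failed for port {port_id}, "
            "please check neutron logs for more information.")
        self.port_id = port_id


def ensure_no_port_binding_failure(port):
    # 'port' is the dict returned by the Neutron ports API.
    if port.get('binding:vif_type') == VIF_TYPE_BINDING_FAILED:
        raise PortBindingFailed(port_id=port['id'])
```

In the log, this check fires inside `_update_port`; `save_and_reraise_exception` then propagates it out of `_allocate_network_async`, the build is aborted, and `_do_build_and_run_instance` re-schedules the instance, which matches the "was re-scheduled: Binding failed for port ..." entries.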
[ 712.285983] env[59620]: INFO nova.scheduler.client.report [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Deleted allocations for instance 588eb672-6240-46cd-8e93-b38c9e2829bf [ 712.308887] env[59620]: DEBUG oslo_concurrency.lockutils [None req-d55aa827-a7ac-46c0-81bd-11eb87ba0ab2 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Lock "588eb672-6240-46cd-8e93-b38c9e2829bf" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 13.908s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 712.577316] env[59620]: DEBUG nova.network.neutron [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] Successfully created port: a5869fae-2063-4a18-99cd-ceda0844e417 {{(pid=59620) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 712.806147] env[59620]: ERROR nova.compute.manager [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 2112df5f-4ebb-4b2d-b9eb-3bca9d4e9655, please check neutron logs for more information. 
[ 712.806147] env[59620]: ERROR nova.compute.manager Traceback (most recent call last):
[ 712.806147] env[59620]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 712.806147] env[59620]: ERROR nova.compute.manager     nwinfo = self.network_api.allocate_for_instance(
[ 712.806147] env[59620]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 712.806147] env[59620]: ERROR nova.compute.manager     created_port_ids = self._update_ports_for_instance(
[ 712.806147] env[59620]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 712.806147] env[59620]: ERROR nova.compute.manager     with excutils.save_and_reraise_exception():
[ 712.806147] env[59620]: ERROR nova.compute.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 712.806147] env[59620]: ERROR nova.compute.manager     self.force_reraise()
[ 712.806147] env[59620]: ERROR nova.compute.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 712.806147] env[59620]: ERROR nova.compute.manager     raise self.value
[ 712.806147] env[59620]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 712.806147] env[59620]: ERROR nova.compute.manager     updated_port = self._update_port(
[ 712.806147] env[59620]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 712.806147] env[59620]: ERROR nova.compute.manager     _ensure_no_port_binding_failure(port)
[ 712.806990] env[59620]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 712.806990] env[59620]: ERROR nova.compute.manager     raise exception.PortBindingFailed(port_id=port['id'])
[ 712.806990] env[59620]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 2112df5f-4ebb-4b2d-b9eb-3bca9d4e9655, please check neutron logs for more information.
[ 712.806990] env[59620]: ERROR nova.compute.manager
[ 712.806990] env[59620]: Traceback (most recent call last):
[ 712.806990] env[59620]:   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait
[ 712.806990] env[59620]:     listener.cb(fileno)
[ 712.806990] env[59620]:   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 712.806990] env[59620]:     result = function(*args, **kwargs)
[ 712.806990] env[59620]:   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 712.806990] env[59620]:     return func(*args, **kwargs)
[ 712.806990] env[59620]:   File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 712.806990] env[59620]:     raise e
[ 712.806990] env[59620]:   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 712.806990] env[59620]:     nwinfo = self.network_api.allocate_for_instance(
[ 712.806990] env[59620]:   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 712.806990] env[59620]:     created_port_ids = self._update_ports_for_instance(
[ 712.806990] env[59620]:   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 712.806990] env[59620]:     with excutils.save_and_reraise_exception():
[ 712.806990] env[59620]:   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 712.806990] env[59620]:     self.force_reraise()
[ 712.806990] env[59620]:   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 712.806990] env[59620]:     raise self.value
[ 712.806990] env[59620]:   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 712.806990] env[59620]:     updated_port = self._update_port(
[ 712.806990] env[59620]:   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 712.806990] env[59620]:     _ensure_no_port_binding_failure(port)
[ 712.806990] env[59620]:   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 712.806990] env[59620]:     raise exception.PortBindingFailed(port_id=port['id'])
[ 712.807716] env[59620]: nova.exception.PortBindingFailed: Binding failed for port 2112df5f-4ebb-4b2d-b9eb-3bca9d4e9655, please check neutron logs for more information.
[ 712.807716] env[59620]: Removing descriptor: 13
[ 712.807716] env[59620]: ERROR nova.compute.manager [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 2112df5f-4ebb-4b2d-b9eb-3bca9d4e9655, please check neutron logs for more information.
[ 712.807716] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] Traceback (most recent call last):
[ 712.807716] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0]   File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 712.807716] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0]     yield resources
[ 712.807716] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 712.807716] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0]     self.driver.spawn(context, instance, image_meta,
[ 712.807716] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 712.807716] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 712.807716] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 712.807716] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0]     vm_ref = self.build_virtual_machine(instance,
[ 712.808290] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 712.808290] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0]     vif_infos = vmwarevif.get_vif_info(self._session,
[ 712.808290] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0]   File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 712.808290] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0]     for vif in network_info:
[ 712.808290] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0]   File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 712.808290] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0]     return self._sync_wrapper(fn, *args, **kwargs)
[ 712.808290] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0]   File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 712.808290] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0]     self.wait()
[ 712.808290] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0]   File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 712.808290] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0]     self[:] = self._gt.wait()
[ 712.808290] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 712.808290] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0]     return self._exit_event.wait()
[ 712.808290] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0]   File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 712.808290] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0]     result = hub.switch()
[ 712.808687] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0]   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 712.808687] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0]     return self.greenlet.switch()
[ 712.808687] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 712.808687] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0]     result = function(*args, **kwargs)
[ 712.808687] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0]   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 712.808687] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0]     return func(*args, **kwargs)
[ 712.808687] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0]   File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 712.808687] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0]     raise e
[ 712.808687] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0]   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 712.808687] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0]     nwinfo = self.network_api.allocate_for_instance(
[ 712.808687] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0]   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 712.808687] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0]     created_port_ids = self._update_ports_for_instance(
[ 712.808687] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0]   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 712.809123] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0]     with excutils.save_and_reraise_exception():
[ 712.809123] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 712.809123] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0]     self.force_reraise()
[ 712.809123] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 712.809123] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0]     raise self.value
[ 712.809123] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0]   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 712.809123] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0]     updated_port = self._update_port(
[ 712.809123] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0]   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 712.809123] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0]     _ensure_no_port_binding_failure(port)
[ 712.809123] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0]   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 712.809123] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0]     raise exception.PortBindingFailed(port_id=port['id'])
[ 712.809123] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] nova.exception.PortBindingFailed: Binding failed for port 2112df5f-4ebb-4b2d-b9eb-3bca9d4e9655, please check neutron logs for more information.
[ 712.809123] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0]
[ 712.809477] env[59620]: INFO nova.compute.manager [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] Terminating instance
[ 712.810324] env[59620]: DEBUG oslo_concurrency.lockutils [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Acquiring lock "refresh_cache-16b35372-2e84-4f6c-ab01-fcbc86e9cca0" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 712.810324] env[59620]: DEBUG oslo_concurrency.lockutils [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Acquired lock "refresh_cache-16b35372-2e84-4f6c-ab01-fcbc86e9cca0" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 712.810324] env[59620]: DEBUG nova.network.neutron [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 712.847295] env[59620]: DEBUG nova.network.neutron [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 712.874398] env[59620]: DEBUG nova.network.neutron [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 712.886988] env[59620]: DEBUG oslo_concurrency.lockutils [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Releasing lock "refresh_cache-e31d29d1-c49c-4696-85c9-11cb985a7bfd" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 712.887253] env[59620]: DEBUG nova.compute.manager [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 712.887434] env[59620]: DEBUG nova.compute.manager [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 712.887592] env[59620]: DEBUG nova.network.neutron [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 712.985769] env[59620]: DEBUG nova.network.neutron [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 712.993947] env[59620]: DEBUG nova.network.neutron [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 713.002481] env[59620]: INFO nova.compute.manager [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] [instance: e31d29d1-c49c-4696-85c9-11cb985a7bfd] Took 0.11 seconds to deallocate network for instance.
[ 713.092338] env[59620]: INFO nova.scheduler.client.report [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Deleted allocations for instance e31d29d1-c49c-4696-85c9-11cb985a7bfd
[ 713.112752] env[59620]: DEBUG oslo_concurrency.lockutils [None req-b291991a-1815-4df6-85ef-360a6665a47f tempest-ImagesOneServerNegativeTestJSON-1962047279 tempest-ImagesOneServerNegativeTestJSON-1962047279-project-member] Lock "e31d29d1-c49c-4696-85c9-11cb985a7bfd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.860s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 713.202033] env[59620]: DEBUG nova.network.neutron [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 713.211900] env[59620]: DEBUG oslo_concurrency.lockutils [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Releasing lock "refresh_cache-16b35372-2e84-4f6c-ab01-fcbc86e9cca0" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 713.212399] env[59620]: DEBUG nova.compute.manager [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] Start destroying the instance on the hypervisor. {{(pid=59620) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 713.212866] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] Destroying instance {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 713.213403] env[59620]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-37220af4-5ce8-47fe-82fb-96998ff31202 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 713.223547] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48c7d4ab-9e2e-4860-98ad-9e5653515581 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 713.250808] env[59620]: WARNING nova.virt.vmwareapi.vmops [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 16b35372-2e84-4f6c-ab01-fcbc86e9cca0 could not be found.
[ 713.251128] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] Instance destroyed {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 713.251356] env[59620]: INFO nova.compute.manager [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 713.251621] env[59620]: DEBUG oslo.service.loopingcall [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=59620) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 713.251999] env[59620]: DEBUG nova.compute.manager [-] [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 713.252164] env[59620]: DEBUG nova.network.neutron [-] [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 713.273775] env[59620]: DEBUG nova.network.neutron [-] [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 713.282151] env[59620]: DEBUG nova.network.neutron [-] [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 713.292943] env[59620]: INFO nova.compute.manager [-] [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] Took 0.04 seconds to deallocate network for instance.
[ 713.295847] env[59620]: DEBUG nova.compute.claims [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] Aborting claim: {{(pid=59620) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 713.296241] env[59620]: DEBUG oslo_concurrency.lockutils [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 713.296512] env[59620]: DEBUG oslo_concurrency.lockutils [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 713.430727] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9a1cfd7-33ac-49bd-8cbe-84bacb2cf7a0 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 713.438920] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e509d97-1f41-48d2-80f9-49094a837f18 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 713.472467] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-25449999-8b73-48af-820e-0204320cf677 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 713.483031] env[59620]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-048ae626-7c39-414e-97c3-5bc660882718 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 713.498371] env[59620]: DEBUG nova.compute.provider_tree [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 713.511365] env[59620]: DEBUG nova.scheduler.client.report [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 713.534159] env[59620]: DEBUG oslo_concurrency.lockutils [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.237s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 713.534833] env[59620]: ERROR nova.compute.manager [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] 
[instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 2112df5f-4ebb-4b2d-b9eb-3bca9d4e9655, please check neutron logs for more information. [ 713.534833] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] Traceback (most recent call last): [ 713.534833] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 713.534833] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] self.driver.spawn(context, instance, image_meta, [ 713.534833] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 713.534833] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] self._vmops.spawn(context, instance, image_meta, injected_files, [ 713.534833] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 713.534833] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] vm_ref = self.build_virtual_machine(instance, [ 713.534833] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 713.534833] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] vif_infos = vmwarevif.get_vif_info(self._session, [ 713.534833] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 713.535153] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] for vif 
in network_info: [ 713.535153] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 713.535153] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] return self._sync_wrapper(fn, *args, **kwargs) [ 713.535153] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 713.535153] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] self.wait() [ 713.535153] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 713.535153] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] self[:] = self._gt.wait() [ 713.535153] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 713.535153] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] return self._exit_event.wait() [ 713.535153] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 713.535153] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] result = hub.switch() [ 713.535153] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 713.535153] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] return self.greenlet.switch() [ 713.535153] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] File 
"/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 713.535648] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] result = function(*args, **kwargs) [ 713.535648] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 713.535648] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] return func(*args, **kwargs) [ 713.535648] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 713.535648] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] raise e [ 713.535648] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 713.535648] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] nwinfo = self.network_api.allocate_for_instance( [ 713.535648] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 713.535648] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] created_port_ids = self._update_ports_for_instance( [ 713.535648] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 713.535648] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] with excutils.save_and_reraise_exception(): [ 713.535648] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] File 
"/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 713.535648] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] self.force_reraise() [ 713.535998] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 713.535998] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] raise self.value [ 713.535998] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 713.535998] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] updated_port = self._update_port( [ 713.535998] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 713.535998] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] _ensure_no_port_binding_failure(port) [ 713.535998] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 713.535998] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] raise exception.PortBindingFailed(port_id=port['id']) [ 713.535998] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] nova.exception.PortBindingFailed: Binding failed for port 2112df5f-4ebb-4b2d-b9eb-3bca9d4e9655, please check neutron logs for more information. 
[ 713.535998] env[59620]: ERROR nova.compute.manager [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] [ 713.536293] env[59620]: DEBUG nova.compute.utils [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] Binding failed for port 2112df5f-4ebb-4b2d-b9eb-3bca9d4e9655, please check neutron logs for more information. {{(pid=59620) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 713.538111] env[59620]: DEBUG nova.compute.manager [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] Build of instance 16b35372-2e84-4f6c-ab01-fcbc86e9cca0 was re-scheduled: Binding failed for port 2112df5f-4ebb-4b2d-b9eb-3bca9d4e9655, please check neutron logs for more information. {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 713.540221] env[59620]: DEBUG nova.compute.manager [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] Unplugging VIFs for instance {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 713.540221] env[59620]: DEBUG oslo_concurrency.lockutils [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Acquiring lock "refresh_cache-16b35372-2e84-4f6c-ab01-fcbc86e9cca0" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 713.540221] env[59620]: DEBUG oslo_concurrency.lockutils [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 
tempest-ServersNegativeTestJSON-1843327250-project-member] Acquired lock "refresh_cache-16b35372-2e84-4f6c-ab01-fcbc86e9cca0" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 713.540221] env[59620]: DEBUG nova.network.neutron [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 713.586644] env[59620]: DEBUG nova.network.neutron [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 713.661475] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Acquiring lock "2e990d70-8e51-4900-9d9d-db920311a8ab" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 713.661758] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "2e990d70-8e51-4900-9d9d-db920311a8ab" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 713.676710] env[59620]: DEBUG nova.compute.manager [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613
tempest-ServersTestJSON-864844613-project-member] [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] Starting instance... {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 713.734100] env[59620]: ERROR nova.compute.manager [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 913bdb45-f543-4a9e-8e8c-d569876ac3b1, please check neutron logs for more information. [ 713.734100] env[59620]: ERROR nova.compute.manager Traceback (most recent call last): [ 713.734100] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 713.734100] env[59620]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 713.734100] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 713.734100] env[59620]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 713.734100] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 713.734100] env[59620]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 713.734100] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 713.734100] env[59620]: ERROR nova.compute.manager self.force_reraise() [ 713.734100] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 713.734100] env[59620]: ERROR nova.compute.manager raise self.value [ 713.734100] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 
713.734100] env[59620]: ERROR nova.compute.manager updated_port = self._update_port( [ 713.734100] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 713.734100] env[59620]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 713.734700] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 713.734700] env[59620]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 713.734700] env[59620]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 913bdb45-f543-4a9e-8e8c-d569876ac3b1, please check neutron logs for more information. [ 713.734700] env[59620]: ERROR nova.compute.manager [ 713.734700] env[59620]: Traceback (most recent call last): [ 713.734700] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 713.734700] env[59620]: listener.cb(fileno) [ 713.734700] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 713.734700] env[59620]: result = function(*args, **kwargs) [ 713.734700] env[59620]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 713.734700] env[59620]: return func(*args, **kwargs) [ 713.734700] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 713.734700] env[59620]: raise e [ 713.734700] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 713.734700] env[59620]: nwinfo = self.network_api.allocate_for_instance( [ 713.734700] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 713.734700] env[59620]: created_port_ids = self._update_ports_for_instance( [ 713.734700] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 
713.734700] env[59620]: with excutils.save_and_reraise_exception(): [ 713.734700] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 713.734700] env[59620]: self.force_reraise() [ 713.734700] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 713.734700] env[59620]: raise self.value [ 713.734700] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 713.734700] env[59620]: updated_port = self._update_port( [ 713.734700] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 713.734700] env[59620]: _ensure_no_port_binding_failure(port) [ 713.734700] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 713.734700] env[59620]: raise exception.PortBindingFailed(port_id=port['id']) [ 713.735736] env[59620]: nova.exception.PortBindingFailed: Binding failed for port 913bdb45-f543-4a9e-8e8c-d569876ac3b1, please check neutron logs for more information. [ 713.735736] env[59620]: Removing descriptor: 19 [ 713.735736] env[59620]: ERROR nova.compute.manager [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 913bdb45-f543-4a9e-8e8c-d569876ac3b1, please check neutron logs for more information. 
[ 713.735736] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] Traceback (most recent call last): [ 713.735736] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 713.735736] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] yield resources [ 713.735736] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 713.735736] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] self.driver.spawn(context, instance, image_meta, [ 713.735736] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 713.735736] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] self._vmops.spawn(context, instance, image_meta, injected_files, [ 713.735736] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 713.735736] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] vm_ref = self.build_virtual_machine(instance, [ 713.736079] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 713.736079] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] vif_infos = vmwarevif.get_vif_info(self._session, [ 713.736079] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 713.736079] env[59620]: ERROR 
nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] for vif in network_info: [ 713.736079] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 713.736079] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] return self._sync_wrapper(fn, *args, **kwargs) [ 713.736079] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 713.736079] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] self.wait() [ 713.736079] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 713.736079] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] self[:] = self._gt.wait() [ 713.736079] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 713.736079] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] return self._exit_event.wait() [ 713.736079] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 713.736079] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] result = hub.switch() [ 713.736420] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 713.736420] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] return self.greenlet.switch() [ 713.736420] env[59620]: ERROR 
nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 713.736420] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] result = function(*args, **kwargs) [ 713.736420] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 713.736420] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] return func(*args, **kwargs) [ 713.736420] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 713.736420] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] raise e [ 713.736420] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 713.736420] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] nwinfo = self.network_api.allocate_for_instance( [ 713.736420] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 713.736420] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] created_port_ids = self._update_ports_for_instance( [ 713.736420] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 713.736730] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] with excutils.save_and_reraise_exception(): [ 713.736730] env[59620]: ERROR nova.compute.manager [instance: 
e89d07fc-9c98-4352-b609-c7fde7ee0d39] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 713.736730] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] self.force_reraise() [ 713.736730] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 713.736730] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] raise self.value [ 713.736730] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 713.736730] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] updated_port = self._update_port( [ 713.736730] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 713.736730] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] _ensure_no_port_binding_failure(port) [ 713.736730] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 713.736730] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] raise exception.PortBindingFailed(port_id=port['id']) [ 713.736730] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] nova.exception.PortBindingFailed: Binding failed for port 913bdb45-f543-4a9e-8e8c-d569876ac3b1, please check neutron logs for more information. 
[ 713.736730] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] [ 713.737091] env[59620]: INFO nova.compute.manager [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] Terminating instance [ 713.737091] env[59620]: DEBUG oslo_concurrency.lockutils [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Acquiring lock "refresh_cache-e89d07fc-9c98-4352-b609-c7fde7ee0d39" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 713.737091] env[59620]: DEBUG oslo_concurrency.lockutils [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Acquired lock "refresh_cache-e89d07fc-9c98-4352-b609-c7fde7ee0d39" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 713.737091] env[59620]: DEBUG nova.network.neutron [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 713.743383] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 713.743492] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd 
tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 713.744903] env[59620]: INFO nova.compute.claims [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 713.827616] env[59620]: DEBUG nova.network.neutron [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 713.843806] env[59620]: DEBUG oslo_concurrency.lockutils [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Releasing lock "refresh_cache-16b35372-2e84-4f6c-ab01-fcbc86e9cca0" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 713.844028] env[59620]: DEBUG nova.compute.manager [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 713.844205] env[59620]: DEBUG nova.compute.manager [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 713.844358] env[59620]: DEBUG nova.network.neutron [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 713.865893] env[59620]: DEBUG nova.network.neutron [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 713.880528] env[59620]: DEBUG nova.network.neutron [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 713.889586] env[59620]: DEBUG nova.network.neutron [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 713.901017] env[59620]: INFO nova.compute.manager [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] [instance: 16b35372-2e84-4f6c-ab01-fcbc86e9cca0] Took 0.05 seconds to deallocate network for instance. [ 713.947023] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-65cecccc-e2f0-4b11-bcc2-89a25e32482a {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 713.955548] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a810d5f1-908a-4f1d-8b14-b72e4aa719b4 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 713.997610] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dae15285-ce91-4abd-8fb9-a4090c6146d3 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 714.005984] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-57aa64be-d720-4d2a-a9cb-758f8c50a1c3 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 714.025252] env[59620]: DEBUG nova.compute.provider_tree [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 
tempest-ServersTestJSON-864844613-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 714.029236] env[59620]: INFO nova.scheduler.client.report [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Deleted allocations for instance 16b35372-2e84-4f6c-ab01-fcbc86e9cca0 [ 714.041154] env[59620]: DEBUG nova.scheduler.client.report [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 714.059078] env[59620]: DEBUG oslo_concurrency.lockutils [None req-5259ae5b-1054-4537-82e4-8f80bef07ec1 tempest-ServersNegativeTestJSON-1843327250 tempest-ServersNegativeTestJSON-1843327250-project-member] Lock "16b35372-2e84-4f6c-ab01-fcbc86e9cca0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.891s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 714.068740] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "compute_resources" "released" by
"nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.322s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 714.068740] env[59620]: DEBUG nova.compute.manager [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] Start building networks asynchronously for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 714.117784] env[59620]: DEBUG nova.compute.utils [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Using /dev/sd instead of None {{(pid=59620) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 714.119598] env[59620]: DEBUG nova.compute.manager [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] Allocating IP information in the background. {{(pid=59620) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 714.119792] env[59620]: DEBUG nova.network.neutron [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] allocate_for_instance() {{(pid=59620) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 714.131116] env[59620]: DEBUG nova.compute.manager [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] Start building block device mappings for instance. 
{{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 714.220865] env[59620]: DEBUG nova.compute.manager [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] Start spawning the instance on the hypervisor. {{(pid=59620) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 714.245273] env[59620]: DEBUG nova.virt.hardware [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T20:10:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T20:10:40Z,direct_url=<?>,disk_format='vmdk',id=2efa4364-ba59-4de9-978f-169a769ee710,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f77c77bd878b4e17895863a4a7a8191f',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2024-10-16T20:10:41Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 714.245273] env[59620]: DEBUG nova.virt.hardware [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Flavor limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 714.245273] env[59620]: DEBUG nova.virt.hardware [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Image limits
0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 714.245471] env[59620]: DEBUG nova.virt.hardware [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Flavor pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 714.245508] env[59620]: DEBUG nova.virt.hardware [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Image pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 714.245798] env[59620]: DEBUG nova.virt.hardware [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 714.245871] env[59620]: DEBUG nova.virt.hardware [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 714.245986] env[59620]: DEBUG nova.virt.hardware [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 714.246159] env[59620]: DEBUG nova.virt.hardware [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Got 1 possible topologies {{(pid=59620) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:501}} [ 714.246316] env[59620]: DEBUG nova.virt.hardware [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 714.246483] env[59620]: DEBUG nova.virt.hardware [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 714.247360] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b64b5a5-b2c2-4aa0-91b1-3825077854c0 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 714.257039] env[59620]: DEBUG oslo_concurrency.lockutils [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Acquiring lock "449cb7ea-c7e9-411c-9c09-f451d892d32c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 714.257460] env[59620]: DEBUG oslo_concurrency.lockutils [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "449cb7ea-c7e9-411c-9c09-f451d892d32c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 714.260313] env[59620]: DEBUG oslo_vmware.service [-] Invoking
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6984d1b3-cfea-4763-85ce-87da8ee0bc07 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 714.266689] env[59620]: DEBUG nova.compute.manager [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] Starting instance... {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 714.293333] env[59620]: DEBUG nova.policy [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f40b2254538745bf9406408a158a55fd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bd8ec082967e4839810b492d2af942a6', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59620) authorize /opt/stack/nova/nova/policy.py:203}} [ 714.330974] env[59620]: DEBUG oslo_concurrency.lockutils [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 714.331242] env[59620]: DEBUG oslo_concurrency.lockutils [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s 
{{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 714.332795] env[59620]: INFO nova.compute.claims [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 714.372516] env[59620]: DEBUG nova.network.neutron [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 714.383130] env[59620]: DEBUG oslo_concurrency.lockutils [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Releasing lock "refresh_cache-e89d07fc-9c98-4352-b609-c7fde7ee0d39" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 714.383510] env[59620]: DEBUG nova.compute.manager [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] Start destroying the instance on the hypervisor. 
{{(pid=59620) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 714.383695] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] Destroying instance {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 714.384162] env[59620]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-51ff2b05-9554-4e02-8998-40cde469112e {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 714.393142] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1dac0831-6fac-477a-80cb-dee2945b24cf {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 714.419445] env[59620]: WARNING nova.virt.vmwareapi.vmops [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance e89d07fc-9c98-4352-b609-c7fde7ee0d39 could not be found. [ 714.419653] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] Instance destroyed {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 714.419820] env[59620]: INFO nova.compute.manager [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 714.420084] env[59620]: DEBUG oslo.service.loopingcall [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59620) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 714.425075] env[59620]: DEBUG nova.compute.manager [-] [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 714.425075] env[59620]: DEBUG nova.network.neutron [-] [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 714.482170] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f61f7a4b-feba-4731-bfcd-619725e7ff74 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 714.485219] env[59620]: DEBUG nova.network.neutron [-] [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 714.491601] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f62c15cf-ff91-49ee-8b3b-dc6fc74c7692 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 714.495791] env[59620]: DEBUG nova.network.neutron [-] [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 714.523599] env[59620]: INFO nova.compute.manager [-] [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] Took 0.10 seconds to deallocate network for instance. [ 714.524345] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32d7a9b4-c3b2-4a04-b1f2-425c0dbca9b1 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 714.528680] env[59620]: DEBUG nova.compute.claims [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] Aborting claim: {{(pid=59620) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 714.528844] env[59620]: DEBUG oslo_concurrency.lockutils [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 714.534265] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-97432d47-979c-4ff0-b375-c107a033a983 {{(pid=59620) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 714.548543] env[59620]: DEBUG nova.compute.provider_tree [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 714.556529] env[59620]: DEBUG nova.scheduler.client.report [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 714.569546] env[59620]: DEBUG oslo_concurrency.lockutils [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.238s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 714.570054] env[59620]: DEBUG nova.compute.manager [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] Start building networks asynchronously for instance. 
{{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 714.572714] env[59620]: DEBUG oslo_concurrency.lockutils [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.044s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 714.613020] env[59620]: DEBUG nova.compute.utils [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Using /dev/sd instead of None {{(pid=59620) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 714.616095] env[59620]: DEBUG nova.compute.manager [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] Allocating IP information in the background. {{(pid=59620) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 714.617466] env[59620]: DEBUG nova.network.neutron [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] allocate_for_instance() {{(pid=59620) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 714.625021] env[59620]: DEBUG nova.compute.manager [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] Start building block device mappings for instance. 
{{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 714.709285] env[59620]: DEBUG nova.compute.manager [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] Start spawning the instance on the hypervisor. {{(pid=59620) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 714.722995] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3637d208-2dec-479a-bed4-aa063c7abf30 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 714.734378] env[59620]: DEBUG nova.virt.hardware [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T20:10:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T20:10:40Z,direct_url=,disk_format='vmdk',id=2efa4364-ba59-4de9-978f-169a769ee710,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f77c77bd878b4e17895863a4a7a8191f',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T20:10:41Z,virtual_size=,visibility=), allow threads: False {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 714.734639] env[59620]: DEBUG nova.virt.hardware [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Flavor 
limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 714.734781] env[59620]: DEBUG nova.virt.hardware [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Image limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 714.734952] env[59620]: DEBUG nova.virt.hardware [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Flavor pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 714.735105] env[59620]: DEBUG nova.virt.hardware [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Image pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 714.735272] env[59620]: DEBUG nova.virt.hardware [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 714.735479] env[59620]: DEBUG nova.virt.hardware [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 714.735708] env[59620]: DEBUG nova.virt.hardware [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Build topologies for 1 
vcpu(s) 1:1:1 {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 714.736588] env[59620]: DEBUG nova.virt.hardware [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Got 1 possible topologies {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 714.736588] env[59620]: DEBUG nova.virt.hardware [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 714.736588] env[59620]: DEBUG nova.virt.hardware [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 714.737522] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-80600e6f-5144-4cb6-a144-4b76d06b5573 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 714.741984] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-efcbd2b5-08cb-4d0c-b9de-3f32821661b0 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 714.780227] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-50b2c6cd-fd4b-4704-975c-d729c687765a {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 714.786163] env[59620]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-561f9862-cb02-478d-af4b-dd569b33c90e {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 714.802517] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-69b5eb1d-31a9-47b3-a3dd-ac0ab49004df {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 714.817392] env[59620]: DEBUG nova.compute.provider_tree [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 714.826888] env[59620]: DEBUG nova.scheduler.client.report [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 714.840764] env[59620]: DEBUG oslo_concurrency.lockutils [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.268s {{(pid=59620) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 714.841867] env[59620]: ERROR nova.compute.manager [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 913bdb45-f543-4a9e-8e8c-d569876ac3b1, please check neutron logs for more information. [ 714.841867] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] Traceback (most recent call last): [ 714.841867] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 714.841867] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] self.driver.spawn(context, instance, image_meta, [ 714.841867] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 714.841867] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] self._vmops.spawn(context, instance, image_meta, injected_files, [ 714.841867] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 714.841867] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] vm_ref = self.build_virtual_machine(instance, [ 714.841867] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 714.841867] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] vif_infos = vmwarevif.get_vif_info(self._session, [ 
714.841867] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 714.842361] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] for vif in network_info: [ 714.842361] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 714.842361] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] return self._sync_wrapper(fn, *args, **kwargs) [ 714.842361] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 714.842361] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] self.wait() [ 714.842361] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 714.842361] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] self[:] = self._gt.wait() [ 714.842361] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 714.842361] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] return self._exit_event.wait() [ 714.842361] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 714.842361] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] result = hub.switch() [ 714.842361] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] File 
"/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 714.842361] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] return self.greenlet.switch() [ 714.842361] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 714.842693] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] result = function(*args, **kwargs) [ 714.842693] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 714.842693] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] return func(*args, **kwargs) [ 714.842693] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 714.842693] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] raise e [ 714.842693] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 714.842693] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] nwinfo = self.network_api.allocate_for_instance( [ 714.842693] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 714.842693] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] created_port_ids = self._update_ports_for_instance( [ 714.842693] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in 
_update_ports_for_instance [ 714.842693] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] with excutils.save_and_reraise_exception(): [ 714.842693] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 714.842693] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] self.force_reraise() [ 714.842980] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 714.842980] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] raise self.value [ 714.842980] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 714.842980] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] updated_port = self._update_port( [ 714.842980] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 714.842980] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] _ensure_no_port_binding_failure(port) [ 714.842980] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 714.842980] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] raise exception.PortBindingFailed(port_id=port['id']) [ 714.842980] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] nova.exception.PortBindingFailed: Binding failed for port 
913bdb45-f543-4a9e-8e8c-d569876ac3b1, please check neutron logs for more information. [ 714.842980] env[59620]: ERROR nova.compute.manager [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] [ 714.842980] env[59620]: DEBUG nova.compute.utils [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] Binding failed for port 913bdb45-f543-4a9e-8e8c-d569876ac3b1, please check neutron logs for more information. {{(pid=59620) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 714.844275] env[59620]: DEBUG nova.compute.manager [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] Build of instance e89d07fc-9c98-4352-b609-c7fde7ee0d39 was re-scheduled: Binding failed for port 913bdb45-f543-4a9e-8e8c-d569876ac3b1, please check neutron logs for more information. 
{{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 714.844683] env[59620]: DEBUG nova.compute.manager [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] Unplugging VIFs for instance {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 714.844901] env[59620]: DEBUG oslo_concurrency.lockutils [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Acquiring lock "refresh_cache-e89d07fc-9c98-4352-b609-c7fde7ee0d39" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 714.845060] env[59620]: DEBUG oslo_concurrency.lockutils [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Acquired lock "refresh_cache-e89d07fc-9c98-4352-b609-c7fde7ee0d39" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 714.845216] env[59620]: DEBUG nova.network.neutron [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 714.876726] env[59620]: DEBUG nova.policy [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6316c8b7da8d4d3c97b2693b33729c52', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 
'cc4c8738af2b48f981e5f2feadb41a59', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59620) authorize /opt/stack/nova/nova/policy.py:203}} [ 714.933355] env[59620]: DEBUG nova.network.neutron [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 715.378147] env[59620]: DEBUG nova.network.neutron [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] Successfully created port: 02d27e3c-0790-4e86-b3f1-9cf06e423758 {{(pid=59620) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 715.475411] env[59620]: DEBUG nova.network.neutron [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 715.484544] env[59620]: DEBUG oslo_concurrency.lockutils [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Releasing lock "refresh_cache-e89d07fc-9c98-4352-b609-c7fde7ee0d39" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 715.484758] env[59620]: DEBUG nova.compute.manager [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 
tempest-ServerExternalEventsTest-180751548-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 715.484913] env[59620]: DEBUG nova.compute.manager [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 715.485108] env[59620]: DEBUG nova.network.neutron [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 715.553499] env[59620]: DEBUG nova.network.neutron [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 715.560863] env[59620]: DEBUG nova.network.neutron [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 715.574119] env[59620]: INFO nova.compute.manager [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] [instance: e89d07fc-9c98-4352-b609-c7fde7ee0d39] Took 0.09 seconds to deallocate network for instance. [ 715.684545] env[59620]: INFO nova.scheduler.client.report [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Deleted allocations for instance e89d07fc-9c98-4352-b609-c7fde7ee0d39 [ 715.703173] env[59620]: DEBUG oslo_concurrency.lockutils [None req-3c50ac49-812f-40fa-97c2-94d4f40a811a tempest-ServerExternalEventsTest-180751548 tempest-ServerExternalEventsTest-180751548-project-member] Lock "e89d07fc-9c98-4352-b609-c7fde7ee0d39" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 15.669s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 715.911999] env[59620]: ERROR nova.compute.manager [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 50df6311-769b-4b6b-9bbd-76405db8df9b, please check neutron logs for more information. 
[ 715.911999] env[59620]: ERROR nova.compute.manager Traceback (most recent call last): [ 715.911999] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 715.911999] env[59620]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 715.911999] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 715.911999] env[59620]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 715.911999] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 715.911999] env[59620]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 715.911999] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 715.911999] env[59620]: ERROR nova.compute.manager self.force_reraise() [ 715.911999] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 715.911999] env[59620]: ERROR nova.compute.manager raise self.value [ 715.911999] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 715.911999] env[59620]: ERROR nova.compute.manager updated_port = self._update_port( [ 715.911999] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 715.911999] env[59620]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 715.912951] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 715.912951] env[59620]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 715.912951] env[59620]: ERROR 
nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 50df6311-769b-4b6b-9bbd-76405db8df9b, please check neutron logs for more information. [ 715.912951] env[59620]: ERROR nova.compute.manager [ 715.912951] env[59620]: Traceback (most recent call last): [ 715.912951] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 715.912951] env[59620]: listener.cb(fileno) [ 715.912951] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 715.912951] env[59620]: result = function(*args, **kwargs) [ 715.912951] env[59620]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 715.912951] env[59620]: return func(*args, **kwargs) [ 715.912951] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 715.912951] env[59620]: raise e [ 715.912951] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 715.912951] env[59620]: nwinfo = self.network_api.allocate_for_instance( [ 715.912951] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 715.912951] env[59620]: created_port_ids = self._update_ports_for_instance( [ 715.912951] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 715.912951] env[59620]: with excutils.save_and_reraise_exception(): [ 715.912951] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 715.912951] env[59620]: self.force_reraise() [ 715.912951] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 715.912951] env[59620]: raise self.value [ 715.912951] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 715.912951] env[59620]: updated_port = self._update_port( [ 715.912951] 
env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 715.912951] env[59620]: _ensure_no_port_binding_failure(port) [ 715.912951] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 715.912951] env[59620]: raise exception.PortBindingFailed(port_id=port['id']) [ 715.913692] env[59620]: nova.exception.PortBindingFailed: Binding failed for port 50df6311-769b-4b6b-9bbd-76405db8df9b, please check neutron logs for more information. [ 715.913692] env[59620]: Removing descriptor: 14 [ 715.913692] env[59620]: ERROR nova.compute.manager [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 50df6311-769b-4b6b-9bbd-76405db8df9b, please check neutron logs for more information. [ 715.913692] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] Traceback (most recent call last): [ 715.913692] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 715.913692] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] yield resources [ 715.913692] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 715.913692] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] self.driver.spawn(context, instance, image_meta, [ 715.913692] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 715.913692] env[59620]: ERROR nova.compute.manager 
[instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] self._vmops.spawn(context, instance, image_meta, injected_files, [ 715.913692] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 715.913692] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] vm_ref = self.build_virtual_machine(instance, [ 715.914022] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 715.914022] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] vif_infos = vmwarevif.get_vif_info(self._session, [ 715.914022] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 715.914022] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] for vif in network_info: [ 715.914022] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 715.914022] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] return self._sync_wrapper(fn, *args, **kwargs) [ 715.914022] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 715.914022] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] self.wait() [ 715.914022] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 715.914022] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] self[:] = self._gt.wait() [ 715.914022] 
env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 715.914022] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] return self._exit_event.wait() [ 715.914022] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 715.914022] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] result = hub.switch() [ 715.914395] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 715.914395] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] return self.greenlet.switch() [ 715.914395] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 715.914395] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] result = function(*args, **kwargs) [ 715.914395] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 715.914395] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] return func(*args, **kwargs) [ 715.914395] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 715.914395] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] raise e [ 715.914395] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] File 
"/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 715.914395] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] nwinfo = self.network_api.allocate_for_instance( [ 715.914395] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 715.914395] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] created_port_ids = self._update_ports_for_instance( [ 715.914395] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 715.914732] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] with excutils.save_and_reraise_exception(): [ 715.914732] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 715.914732] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] self.force_reraise() [ 715.914732] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 715.914732] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] raise self.value [ 715.914732] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 715.914732] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] updated_port = self._update_port( [ 715.914732] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] File 
"/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 715.914732] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] _ensure_no_port_binding_failure(port) [ 715.914732] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 715.914732] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] raise exception.PortBindingFailed(port_id=port['id']) [ 715.914732] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] nova.exception.PortBindingFailed: Binding failed for port 50df6311-769b-4b6b-9bbd-76405db8df9b, please check neutron logs for more information. [ 715.914732] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] [ 715.915976] env[59620]: INFO nova.compute.manager [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] Terminating instance [ 715.916965] env[59620]: DEBUG oslo_concurrency.lockutils [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Acquiring lock "refresh_cache-6fdadbc2-14e5-440f-aba2-4db693f56de6" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 715.917277] env[59620]: DEBUG oslo_concurrency.lockutils [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Acquired lock "refresh_cache-6fdadbc2-14e5-440f-aba2-4db693f56de6" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 715.918373] env[59620]: DEBUG nova.network.neutron [None 
req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 716.018657] env[59620]: DEBUG nova.network.neutron [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 716.931167] env[59620]: DEBUG nova.network.neutron [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 716.945578] env[59620]: DEBUG oslo_concurrency.lockutils [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Releasing lock "refresh_cache-6fdadbc2-14e5-440f-aba2-4db693f56de6" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 716.946506] env[59620]: DEBUG nova.compute.manager [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] Start destroying the instance on the hypervisor. 
{{(pid=59620) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 716.946506] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] Destroying instance {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 716.946756] env[59620]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-3513807f-03f7-4dc6-a475-795bdc6f491d {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 716.965704] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0721741e-a5d3-417e-9159-b9e0e3b3de21 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 716.988277] env[59620]: WARNING nova.virt.vmwareapi.vmops [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 6fdadbc2-14e5-440f-aba2-4db693f56de6 could not be found. 
[ 716.988544] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] Instance destroyed {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 716.988729] env[59620]: INFO nova.compute.manager [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] Took 0.04 seconds to destroy the instance on the hypervisor. [ 716.989008] env[59620]: DEBUG oslo.service.loopingcall [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59620) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 716.989214] env[59620]: DEBUG nova.compute.manager [-] [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 716.989309] env[59620]: DEBUG nova.network.neutron [-] [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 717.078188] env[59620]: DEBUG nova.network.neutron [-] [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 717.088080] env[59620]: DEBUG nova.network.neutron [-] [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 717.100073] env[59620]: INFO nova.compute.manager [-] [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] Took 0.11 seconds to deallocate network for instance. [ 717.103651] env[59620]: DEBUG nova.compute.claims [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] Aborting claim: {{(pid=59620) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 717.103827] env[59620]: DEBUG oslo_concurrency.lockutils [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 717.104054] env[59620]: DEBUG oslo_concurrency.lockutils [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 717.160325] env[59620]: DEBUG nova.network.neutron [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] Successfully created port: 4ca49a09-f39b-4b5d-8ffe-428938f9486e 
{{(pid=59620) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 717.240225] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-86e155e8-9721-49c1-8e65-b380e1a6a905 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 717.247633] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-480c10f1-9e35-48f1-95f5-0c4ee5e6c725 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 717.293105] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6647ad22-1913-44f4-8f52-be0e29875c70 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 717.301940] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0942693-8bf9-4020-ac25-d296684336c3 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 717.317155] env[59620]: DEBUG nova.compute.provider_tree [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 717.331852] env[59620]: DEBUG nova.scheduler.client.report [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 
'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 717.350134] env[59620]: DEBUG oslo_concurrency.lockutils [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.246s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 717.350812] env[59620]: ERROR nova.compute.manager [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 50df6311-769b-4b6b-9bbd-76405db8df9b, please check neutron logs for more information. 
[ 717.350812] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] Traceback (most recent call last): [ 717.350812] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 717.350812] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] self.driver.spawn(context, instance, image_meta, [ 717.350812] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 717.350812] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] self._vmops.spawn(context, instance, image_meta, injected_files, [ 717.350812] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 717.350812] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] vm_ref = self.build_virtual_machine(instance, [ 717.350812] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 717.350812] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] vif_infos = vmwarevif.get_vif_info(self._session, [ 717.350812] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 717.351173] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] for vif in network_info: [ 717.351173] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 717.351173] env[59620]: ERROR 
nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] return self._sync_wrapper(fn, *args, **kwargs)
[ 717.351173] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 717.351173] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] self.wait()
[ 717.351173] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 717.351173] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] self[:] = self._gt.wait()
[ 717.351173] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 717.351173] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] return self._exit_event.wait()
[ 717.351173] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 717.351173] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] result = hub.switch()
[ 717.351173] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 717.351173] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] return self.greenlet.switch()
[ 717.351173] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 717.351500] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] result = function(*args, **kwargs)
[ 717.351500] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 717.351500] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] return func(*args, **kwargs)
[ 717.351500] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 717.351500] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] raise e
[ 717.351500] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 717.351500] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] nwinfo = self.network_api.allocate_for_instance(
[ 717.351500] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 717.351500] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] created_port_ids = self._update_ports_for_instance(
[ 717.351500] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 717.351500] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] with excutils.save_and_reraise_exception():
[ 717.351500] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 717.351500] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] self.force_reraise()
[ 717.351809] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 717.351809] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] raise self.value
[ 717.351809] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 717.351809] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] updated_port = self._update_port(
[ 717.351809] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 717.351809] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] _ensure_no_port_binding_failure(port)
[ 717.351809] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 717.351809] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] raise exception.PortBindingFailed(port_id=port['id'])
[ 717.351809] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] nova.exception.PortBindingFailed: Binding failed for port 50df6311-769b-4b6b-9bbd-76405db8df9b, please check neutron logs for more information.
[ 717.351809] env[59620]: ERROR nova.compute.manager [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6]
[ 717.351809] env[59620]: DEBUG nova.compute.utils [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] Binding failed for port 50df6311-769b-4b6b-9bbd-76405db8df9b, please check neutron logs for more information. {{(pid=59620) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 717.353764] env[59620]: DEBUG nova.compute.manager [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] Build of instance 6fdadbc2-14e5-440f-aba2-4db693f56de6 was re-scheduled: Binding failed for port 50df6311-769b-4b6b-9bbd-76405db8df9b, please check neutron logs for more information. {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 717.354605] env[59620]: DEBUG nova.compute.manager [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] Unplugging VIFs for instance {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 717.354605] env[59620]: DEBUG oslo_concurrency.lockutils [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Acquiring lock "refresh_cache-6fdadbc2-14e5-440f-aba2-4db693f56de6" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 717.354605] env[59620]: DEBUG oslo_concurrency.lockutils [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Acquired lock "refresh_cache-6fdadbc2-14e5-440f-aba2-4db693f56de6" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 717.354746] env[59620]: DEBUG nova.network.neutron [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 717.645532] env[59620]: DEBUG nova.network.neutron [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 718.303318] env[59620]: ERROR nova.compute.manager [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port f55d6baf-cc51-4f55-88ec-78d6f8b9c411, please check neutron logs for more information.
[ 718.303318] env[59620]: ERROR nova.compute.manager Traceback (most recent call last):
[ 718.303318] env[59620]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 718.303318] env[59620]: ERROR nova.compute.manager     nwinfo = self.network_api.allocate_for_instance(
[ 718.303318] env[59620]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 718.303318] env[59620]: ERROR nova.compute.manager     created_port_ids = self._update_ports_for_instance(
[ 718.303318] env[59620]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 718.303318] env[59620]: ERROR nova.compute.manager     with excutils.save_and_reraise_exception():
[ 718.303318] env[59620]: ERROR nova.compute.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 718.303318] env[59620]: ERROR nova.compute.manager     self.force_reraise()
[ 718.303318] env[59620]: ERROR nova.compute.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 718.303318] env[59620]: ERROR nova.compute.manager     raise self.value
[ 718.303318] env[59620]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 718.303318] env[59620]: ERROR nova.compute.manager     updated_port = self._update_port(
[ 718.303318] env[59620]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 718.303318] env[59620]: ERROR nova.compute.manager     _ensure_no_port_binding_failure(port)
[ 718.304031] env[59620]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 718.304031] env[59620]: ERROR nova.compute.manager     raise exception.PortBindingFailed(port_id=port['id'])
[ 718.304031] env[59620]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port f55d6baf-cc51-4f55-88ec-78d6f8b9c411, please check neutron logs for more information.
[ 718.304031] env[59620]: ERROR nova.compute.manager
[ 718.304031] env[59620]: Traceback (most recent call last):
[ 718.304031] env[59620]:   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait
[ 718.304031] env[59620]:     listener.cb(fileno)
[ 718.304031] env[59620]:   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 718.304031] env[59620]:     result = function(*args, **kwargs)
[ 718.304031] env[59620]:   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 718.304031] env[59620]:     return func(*args, **kwargs)
[ 718.304031] env[59620]:   File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 718.304031] env[59620]:     raise e
[ 718.304031] env[59620]:   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 718.304031] env[59620]:     nwinfo = self.network_api.allocate_for_instance(
[ 718.304031] env[59620]:   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 718.304031] env[59620]:     created_port_ids = self._update_ports_for_instance(
[ 718.304031] env[59620]:   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 718.304031] env[59620]:     with excutils.save_and_reraise_exception():
[ 718.304031] env[59620]:   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 718.304031] env[59620]:     self.force_reraise()
[ 718.304031] env[59620]:   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 718.304031] env[59620]:     raise self.value
[ 718.304031] env[59620]:   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 718.304031] env[59620]:     updated_port = self._update_port(
[ 718.304031] env[59620]:   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 718.304031] env[59620]:     _ensure_no_port_binding_failure(port)
[ 718.304031] env[59620]:   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 718.304031] env[59620]:     raise exception.PortBindingFailed(port_id=port['id'])
[ 718.304850] env[59620]: nova.exception.PortBindingFailed: Binding failed for port f55d6baf-cc51-4f55-88ec-78d6f8b9c411, please check neutron logs for more information.
[ 718.304850] env[59620]: Removing descriptor: 15
[ 718.304850] env[59620]: ERROR nova.compute.manager [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port f55d6baf-cc51-4f55-88ec-78d6f8b9c411, please check neutron logs for more information.
[ 718.304850] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] Traceback (most recent call last):
[ 718.304850] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 718.304850] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] yield resources
[ 718.304850] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 718.304850] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] self.driver.spawn(context, instance, image_meta,
[ 718.304850] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 718.304850] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 718.304850] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 718.304850] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] vm_ref = self.build_virtual_machine(instance,
[ 718.305258] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 718.305258] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] vif_infos = vmwarevif.get_vif_info(self._session,
[ 718.305258] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 718.305258] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] for vif in network_info:
[ 718.305258] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 718.305258] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] return self._sync_wrapper(fn, *args, **kwargs)
[ 718.305258] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 718.305258] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] self.wait()
[ 718.305258] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 718.305258] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] self[:] = self._gt.wait()
[ 718.305258] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 718.305258] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] return self._exit_event.wait()
[ 718.305258] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 718.305258] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] result = hub.switch()
[ 718.305654] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 718.305654] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] return self.greenlet.switch()
[ 718.305654] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 718.305654] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] result = function(*args, **kwargs)
[ 718.305654] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 718.305654] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] return func(*args, **kwargs)
[ 718.305654] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 718.305654] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] raise e
[ 718.305654] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 718.305654] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] nwinfo = self.network_api.allocate_for_instance(
[ 718.305654] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 718.305654] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] created_port_ids = self._update_ports_for_instance(
[ 718.305654] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 718.306030] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] with excutils.save_and_reraise_exception():
[ 718.306030] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 718.306030] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] self.force_reraise()
[ 718.306030] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 718.306030] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] raise self.value
[ 718.306030] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 718.306030] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] updated_port = self._update_port(
[ 718.306030] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 718.306030] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] _ensure_no_port_binding_failure(port)
[ 718.306030] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 718.306030] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] raise exception.PortBindingFailed(port_id=port['id'])
[ 718.306030] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] nova.exception.PortBindingFailed: Binding failed for port f55d6baf-cc51-4f55-88ec-78d6f8b9c411, please check neutron logs for more information.
[ 718.306030] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e]
[ 718.306389] env[59620]: INFO nova.compute.manager [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] Terminating instance
[ 718.313435] env[59620]: DEBUG oslo_concurrency.lockutils [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Acquiring lock "refresh_cache-73b2fd88-ded1-4a92-a973-6a49e57faa5e" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 718.313435] env[59620]: DEBUG oslo_concurrency.lockutils [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Acquired lock "refresh_cache-73b2fd88-ded1-4a92-a973-6a49e57faa5e" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 718.313435] env[59620]: DEBUG nova.network.neutron [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 718.449153] env[59620]: DEBUG nova.network.neutron [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 718.497646] env[59620]: DEBUG nova.network.neutron [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 718.515181] env[59620]: DEBUG oslo_concurrency.lockutils [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Releasing lock "refresh_cache-6fdadbc2-14e5-440f-aba2-4db693f56de6" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 718.515181] env[59620]: DEBUG nova.compute.manager [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 718.515181] env[59620]: DEBUG nova.compute.manager [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 718.515181] env[59620]: DEBUG nova.network.neutron [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 718.798570] env[59620]: DEBUG nova.network.neutron [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 718.806961] env[59620]: DEBUG nova.network.neutron [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 718.815404] env[59620]: INFO nova.compute.manager [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] [instance: 6fdadbc2-14e5-440f-aba2-4db693f56de6] Took 0.30 seconds to deallocate network for instance.
[ 718.902885] env[59620]: INFO nova.scheduler.client.report [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Deleted allocations for instance 6fdadbc2-14e5-440f-aba2-4db693f56de6
[ 718.935720] env[59620]: DEBUG oslo_concurrency.lockutils [None req-fddcb85f-3a85-4025-b855-feae0a7344e7 tempest-ListServerFiltersTestJSON-1569189728 tempest-ListServerFiltersTestJSON-1569189728-project-member] Lock "6fdadbc2-14e5-440f-aba2-4db693f56de6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 15.854s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 719.353663] env[59620]: DEBUG nova.network.neutron [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 719.365050] env[59620]: DEBUG oslo_concurrency.lockutils [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Releasing lock "refresh_cache-73b2fd88-ded1-4a92-a973-6a49e57faa5e" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 719.365050] env[59620]: DEBUG nova.compute.manager [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] Start destroying the instance on the hypervisor. {{(pid=59620) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 719.365050] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] Destroying instance {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 719.365050] env[59620]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-99158f6d-9877-43d3-b304-f2200c499ea0 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 719.377942] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-96a8d500-dfbe-4b71-bc56-b83fd9d18c1f {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 719.409299] env[59620]: WARNING nova.virt.vmwareapi.vmops [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 73b2fd88-ded1-4a92-a973-6a49e57faa5e could not be found.
[ 719.410030] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] Instance destroyed {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 719.410030] env[59620]: INFO nova.compute.manager [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] Took 0.05 seconds to destroy the instance on the hypervisor.
[ 719.410183] env[59620]: DEBUG oslo.service.loopingcall [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59620) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 719.410504] env[59620]: DEBUG nova.compute.manager [-] [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 719.410504] env[59620]: DEBUG nova.network.neutron [-] [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 719.504955] env[59620]: DEBUG nova.network.neutron [-] [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 719.511806] env[59620]: DEBUG nova.network.neutron [-] [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 719.523281] env[59620]: INFO nova.compute.manager [-] [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] Took 0.11 seconds to deallocate network for instance.
[ 719.525829] env[59620]: DEBUG nova.compute.claims [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] Aborting claim: {{(pid=59620) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 719.525829] env[59620]: DEBUG oslo_concurrency.lockutils [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 719.525829] env[59620]: DEBUG oslo_concurrency.lockutils [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 719.641391] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-763e1cd7-b81a-4549-9019-a8b77c29f83e {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 719.650904] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b98c371-9bd0-428f-a3e3-f868af2b49e6 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 719.687492] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ac9c2d2a-8b0a-48f3-b5e3-54690bbd13ff {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 719.695521] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c53e2ff6-0916-4dc4-afb3-06b4c07d5dd9 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 719.711499] env[59620]: DEBUG nova.compute.provider_tree [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 719.720666] env[59620]: DEBUG nova.scheduler.client.report [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 719.737440] env[59620]: DEBUG oslo_concurrency.lockutils [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.212s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 719.738663] env[59620]: ERROR nova.compute.manager [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port f55d6baf-cc51-4f55-88ec-78d6f8b9c411, please check neutron logs for more information.
[ 719.738663] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] Traceback (most recent call last):
[ 719.738663] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 719.738663] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] self.driver.spawn(context, instance, image_meta,
[ 719.738663] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 719.738663] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 719.738663] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 719.738663] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] vm_ref = self.build_virtual_machine(instance,
[ 719.738663] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 719.738663] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] vif_infos = vmwarevif.get_vif_info(self._session,
[ 719.738663] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 719.739047] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] for vif in network_info:
[ 719.739047] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 719.739047] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] return self._sync_wrapper(fn, *args, **kwargs)
[ 719.739047] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 719.739047] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] self.wait()
[ 719.739047] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 719.739047] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] self[:] = self._gt.wait()
[ 719.739047] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 719.739047] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] return self._exit_event.wait()
[ 719.739047] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 719.739047] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] result = hub.switch()
[ 719.739047] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 719.739047] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] return self.greenlet.switch()
[ 719.739047] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 719.739479] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] result = function(*args, **kwargs)
[ 719.739479] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 719.739479] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] return func(*args, **kwargs)
[ 719.739479] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 719.739479] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] raise e
[ 719.739479] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 719.739479] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] nwinfo = self.network_api.allocate_for_instance(
[ 719.739479] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 719.739479] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] created_port_ids = self._update_ports_for_instance(
[ 719.739479] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 719.739479] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] with excutils.save_and_reraise_exception():
[ 719.739479] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] File
"/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 719.739479] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] self.force_reraise() [ 719.739794] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 719.739794] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] raise self.value [ 719.739794] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 719.739794] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] updated_port = self._update_port( [ 719.739794] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 719.739794] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] _ensure_no_port_binding_failure(port) [ 719.739794] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 719.739794] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] raise exception.PortBindingFailed(port_id=port['id']) [ 719.739794] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] nova.exception.PortBindingFailed: Binding failed for port f55d6baf-cc51-4f55-88ec-78d6f8b9c411, please check neutron logs for more information. 
[ 719.739794] env[59620]: ERROR nova.compute.manager [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] [ 719.739794] env[59620]: DEBUG nova.compute.utils [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] Binding failed for port f55d6baf-cc51-4f55-88ec-78d6f8b9c411, please check neutron logs for more information. {{(pid=59620) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 719.742929] env[59620]: DEBUG nova.compute.manager [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] Build of instance 73b2fd88-ded1-4a92-a973-6a49e57faa5e was re-scheduled: Binding failed for port f55d6baf-cc51-4f55-88ec-78d6f8b9c411, please check neutron logs for more information. {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 719.742929] env[59620]: DEBUG nova.compute.manager [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] Unplugging VIFs for instance {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 719.742929] env[59620]: DEBUG oslo_concurrency.lockutils [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Acquiring lock "refresh_cache-73b2fd88-ded1-4a92-a973-6a49e57faa5e" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 719.742929] env[59620]: DEBUG oslo_concurrency.lockutils [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Acquired lock 
"refresh_cache-73b2fd88-ded1-4a92-a973-6a49e57faa5e" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 719.743208] env[59620]: DEBUG nova.network.neutron [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 719.965322] env[59620]: DEBUG nova.network.neutron [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 720.268629] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Acquiring lock "9999064a-7edc-4e2c-92fb-7a713194764c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 720.269135] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "9999064a-7edc-4e2c-92fb-7a713194764c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 720.282034] env[59620]: DEBUG nova.compute.manager [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 
tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] Starting instance... {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 720.296114] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Acquiring lock "a1f06df7-a38c-431d-98ee-7b3df8224ea1" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 720.296335] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "a1f06df7-a38c-431d-98ee-7b3df8224ea1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 720.306907] env[59620]: DEBUG nova.compute.manager [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] Starting instance... 
{{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 720.350392] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 720.350636] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 720.352056] env[59620]: INFO nova.compute.claims [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 720.367445] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 720.528516] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e39b1b4-dd5b-4870-9090-23b158a9c466 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 720.538227] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx 
with opID=oslo.vmware-765b5f79-7aef-41e3-8a2f-14d039c0d171 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 720.572169] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8badb8cb-b839-45e0-aaf9-860ff41ba8e8 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 720.579562] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0594900d-580f-43c9-bf4e-9b6d18c12d3a {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 720.594988] env[59620]: DEBUG nova.compute.provider_tree [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 720.605440] env[59620]: DEBUG nova.scheduler.client.report [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 720.622669] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 
tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.270s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 720.622669] env[59620]: DEBUG nova.compute.manager [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] Start building networks asynchronously for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 720.625211] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.257s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 720.626673] env[59620]: INFO nova.compute.claims [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 720.660086] env[59620]: DEBUG nova.compute.utils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Using /dev/sd instead of None {{(pid=59620) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 720.660318] env[59620]: DEBUG nova.compute.manager [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] Allocating IP information in the background. 
{{(pid=59620) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 720.660585] env[59620]: DEBUG nova.network.neutron [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] allocate_for_instance() {{(pid=59620) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 720.669660] env[59620]: DEBUG nova.compute.manager [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] Start building block device mappings for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 720.737985] env[59620]: DEBUG nova.network.neutron [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 720.745902] env[59620]: DEBUG nova.compute.manager [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] Start spawning the instance on the hypervisor. 
{{(pid=59620) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 720.748386] env[59620]: DEBUG oslo_concurrency.lockutils [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Releasing lock "refresh_cache-73b2fd88-ded1-4a92-a973-6a49e57faa5e" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 720.748763] env[59620]: DEBUG nova.compute.manager [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 720.748971] env[59620]: DEBUG nova.compute.manager [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 720.749178] env[59620]: DEBUG nova.network.neutron [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 720.768579] env[59620]: DEBUG nova.virt.hardware [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Getting desirable topologies for flavor 
Flavor(created_at=2024-10-16T20:10:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T20:10:40Z,direct_url=,disk_format='vmdk',id=2efa4364-ba59-4de9-978f-169a769ee710,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f77c77bd878b4e17895863a4a7a8191f',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T20:10:41Z,virtual_size=,visibility=), allow threads: False {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 720.768878] env[59620]: DEBUG nova.virt.hardware [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Flavor limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 720.769077] env[59620]: DEBUG nova.virt.hardware [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Image limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 720.769342] env[59620]: DEBUG nova.virt.hardware [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Flavor pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 720.769528] env[59620]: DEBUG nova.virt.hardware [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Image pref 0:0:0 {{(pid=59620) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 720.769704] env[59620]: DEBUG nova.virt.hardware [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 720.769940] env[59620]: DEBUG nova.virt.hardware [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 720.770479] env[59620]: DEBUG nova.virt.hardware [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 720.770479] env[59620]: DEBUG nova.virt.hardware [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Got 1 possible topologies {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 720.770479] env[59620]: DEBUG nova.virt.hardware [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 720.770695] env[59620]: DEBUG nova.virt.hardware [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 
tempest-MultipleCreateTestJSON-1361786026-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 720.771723] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eac45395-c11e-44d1-9d25-788d9dbc794b {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 720.776253] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc60bbb4-1fef-4f91-bdb8-6f7aa5629f89 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 720.781821] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e02cf386-1633-4937-b529-dc129862e837 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 720.787803] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cbdf5980-0662-48f1-a953-d47cb4279f14 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 720.824714] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d6549338-2610-4e76-aa1b-6dc25200c80c {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 720.832317] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-03f6a21b-561c-4307-9f5d-cd3e3d7b377a {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 720.846260] env[59620]: DEBUG nova.compute.provider_tree [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] 
Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 720.854415] env[59620]: DEBUG nova.scheduler.client.report [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 720.869818] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.245s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 720.871314] env[59620]: DEBUG nova.compute.manager [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] Start building networks asynchronously for instance. 
{{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 720.913666] env[59620]: DEBUG nova.compute.utils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Using /dev/sd instead of None {{(pid=59620) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 720.914918] env[59620]: DEBUG nova.compute.manager [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] Allocating IP information in the background. {{(pid=59620) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 720.915097] env[59620]: DEBUG nova.network.neutron [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] allocate_for_instance() {{(pid=59620) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 720.924413] env[59620]: DEBUG nova.compute.manager [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] Start building block device mappings for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 720.987082] env[59620]: DEBUG nova.compute.manager [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] Start spawning the instance on the hypervisor. 
{{(pid=59620) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 721.010902] env[59620]: DEBUG nova.virt.hardware [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T20:10:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T20:10:40Z,direct_url=,disk_format='vmdk',id=2efa4364-ba59-4de9-978f-169a769ee710,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f77c77bd878b4e17895863a4a7a8191f',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T20:10:41Z,virtual_size=,visibility=), allow threads: False {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 721.010902] env[59620]: DEBUG nova.virt.hardware [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Flavor limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 721.010902] env[59620]: DEBUG nova.virt.hardware [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Image limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 721.011144] env[59620]: DEBUG nova.virt.hardware [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Flavor pref 0:0:0 
{{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 721.011265] env[59620]: DEBUG nova.virt.hardware [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Image pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 721.011349] env[59620]: DEBUG nova.virt.hardware [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 721.011549] env[59620]: DEBUG nova.virt.hardware [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 721.012127] env[59620]: DEBUG nova.virt.hardware [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 721.012320] env[59620]: DEBUG nova.virt.hardware [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Got 1 possible topologies {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 721.012485] env[59620]: DEBUG nova.virt.hardware [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 
tempest-MultipleCreateTestJSON-1361786026-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 721.012654] env[59620]: DEBUG nova.virt.hardware [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 721.013509] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f7e045a-8291-458f-8675-00729568eb14 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 721.022283] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f6d0aae6-76fd-406c-a3f6-02475de8b984 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 721.179126] env[59620]: DEBUG nova.network.neutron [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 721.187048] env[59620]: DEBUG nova.network.neutron [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 721.196355] env[59620]: INFO nova.compute.manager [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] [instance: 73b2fd88-ded1-4a92-a973-6a49e57faa5e] Took 0.45 seconds to deallocate network for instance. [ 721.287453] env[59620]: INFO nova.scheduler.client.report [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Deleted allocations for instance 73b2fd88-ded1-4a92-a973-6a49e57faa5e [ 721.303124] env[59620]: DEBUG oslo_concurrency.lockutils [None req-219b4615-33d5-49f6-afd0-48db5f40eacd tempest-ServerGroupTestJSON-1233429301 tempest-ServerGroupTestJSON-1233429301-project-member] Lock "73b2fd88-ded1-4a92-a973-6a49e57faa5e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 16.670s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 721.373638] env[59620]: DEBUG nova.policy [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bfbe2267ded84c71b3af181cb852d581', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1de6a55b95aa4af2865ec70142a20326', 'project_domain_id': 'default', 
'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59620) authorize /opt/stack/nova/nova/policy.py:203}} [ 721.575935] env[59620]: DEBUG nova.policy [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bfbe2267ded84c71b3af181cb852d581', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1de6a55b95aa4af2865ec70142a20326', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59620) authorize /opt/stack/nova/nova/policy.py:203}} [ 722.599262] env[59620]: ERROR nova.compute.manager [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port a5869fae-2063-4a18-99cd-ceda0844e417, please check neutron logs for more information. 
[ 722.599262] env[59620]: ERROR nova.compute.manager Traceback (most recent call last): [ 722.599262] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 722.599262] env[59620]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 722.599262] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 722.599262] env[59620]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 722.599262] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 722.599262] env[59620]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 722.599262] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 722.599262] env[59620]: ERROR nova.compute.manager self.force_reraise() [ 722.599262] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 722.599262] env[59620]: ERROR nova.compute.manager raise self.value [ 722.599262] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 722.599262] env[59620]: ERROR nova.compute.manager updated_port = self._update_port( [ 722.599262] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 722.599262] env[59620]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 722.599988] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 722.599988] env[59620]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 722.599988] env[59620]: ERROR 
nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port a5869fae-2063-4a18-99cd-ceda0844e417, please check neutron logs for more information. [ 722.599988] env[59620]: ERROR nova.compute.manager [ 722.599988] env[59620]: Traceback (most recent call last): [ 722.599988] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 722.599988] env[59620]: listener.cb(fileno) [ 722.599988] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 722.599988] env[59620]: result = function(*args, **kwargs) [ 722.599988] env[59620]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 722.599988] env[59620]: return func(*args, **kwargs) [ 722.599988] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 722.599988] env[59620]: raise e [ 722.599988] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 722.599988] env[59620]: nwinfo = self.network_api.allocate_for_instance( [ 722.599988] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 722.599988] env[59620]: created_port_ids = self._update_ports_for_instance( [ 722.599988] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 722.599988] env[59620]: with excutils.save_and_reraise_exception(): [ 722.599988] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 722.599988] env[59620]: self.force_reraise() [ 722.599988] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 722.599988] env[59620]: raise self.value [ 722.599988] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 722.599988] env[59620]: updated_port = self._update_port( [ 722.599988] 
env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 722.599988] env[59620]: _ensure_no_port_binding_failure(port) [ 722.599988] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 722.599988] env[59620]: raise exception.PortBindingFailed(port_id=port['id']) [ 722.600880] env[59620]: nova.exception.PortBindingFailed: Binding failed for port a5869fae-2063-4a18-99cd-ceda0844e417, please check neutron logs for more information. [ 722.600880] env[59620]: Removing descriptor: 22 [ 722.600880] env[59620]: ERROR nova.compute.manager [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port a5869fae-2063-4a18-99cd-ceda0844e417, please check neutron logs for more information. [ 722.600880] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] Traceback (most recent call last): [ 722.600880] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 722.600880] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] yield resources [ 722.600880] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 722.600880] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] self.driver.spawn(context, instance, image_meta, [ 722.600880] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 722.600880] env[59620]: ERROR 
nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 722.600880] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 722.600880] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] vm_ref = self.build_virtual_machine(instance, [ 722.601276] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 722.601276] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] vif_infos = vmwarevif.get_vif_info(self._session, [ 722.601276] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 722.601276] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] for vif in network_info: [ 722.601276] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 722.601276] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] return self._sync_wrapper(fn, *args, **kwargs) [ 722.601276] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 722.601276] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] self.wait() [ 722.601276] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 722.601276] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] self[:] = 
self._gt.wait() [ 722.601276] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 722.601276] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] return self._exit_event.wait() [ 722.601276] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 722.601276] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] result = hub.switch() [ 722.601712] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 722.601712] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] return self.greenlet.switch() [ 722.601712] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 722.601712] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] result = function(*args, **kwargs) [ 722.601712] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 722.601712] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] return func(*args, **kwargs) [ 722.601712] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 722.601712] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] raise e [ 722.601712] env[59620]: ERROR nova.compute.manager [instance: 
21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 722.601712] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] nwinfo = self.network_api.allocate_for_instance( [ 722.601712] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 722.601712] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] created_port_ids = self._update_ports_for_instance( [ 722.601712] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 722.602099] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] with excutils.save_and_reraise_exception(): [ 722.602099] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 722.602099] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] self.force_reraise() [ 722.602099] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 722.602099] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] raise self.value [ 722.602099] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 722.602099] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] updated_port = self._update_port( [ 722.602099] env[59620]: ERROR nova.compute.manager [instance: 
21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 722.602099] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] _ensure_no_port_binding_failure(port) [ 722.602099] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 722.602099] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] raise exception.PortBindingFailed(port_id=port['id']) [ 722.602099] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] nova.exception.PortBindingFailed: Binding failed for port a5869fae-2063-4a18-99cd-ceda0844e417, please check neutron logs for more information. [ 722.602099] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] [ 722.602482] env[59620]: INFO nova.compute.manager [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] Terminating instance [ 722.604458] env[59620]: DEBUG oslo_concurrency.lockutils [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Acquiring lock "refresh_cache-21f0c0f4-9fbd-4d34-be36-6dae6538bf9c" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 722.604615] env[59620]: DEBUG oslo_concurrency.lockutils [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Acquired lock "refresh_cache-21f0c0f4-9fbd-4d34-be36-6dae6538bf9c" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 
722.604782] env[59620]: DEBUG nova.network.neutron [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 722.715797] env[59620]: DEBUG nova.network.neutron [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 723.102967] env[59620]: ERROR nova.compute.manager [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 02d27e3c-0790-4e86-b3f1-9cf06e423758, please check neutron logs for more information. 
[ 723.102967] env[59620]: ERROR nova.compute.manager Traceback (most recent call last): [ 723.102967] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 723.102967] env[59620]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 723.102967] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 723.102967] env[59620]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 723.102967] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 723.102967] env[59620]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 723.102967] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 723.102967] env[59620]: ERROR nova.compute.manager self.force_reraise() [ 723.102967] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 723.102967] env[59620]: ERROR nova.compute.manager raise self.value [ 723.102967] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 723.102967] env[59620]: ERROR nova.compute.manager updated_port = self._update_port( [ 723.102967] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 723.102967] env[59620]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 723.103441] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 723.103441] env[59620]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 723.103441] env[59620]: ERROR 
nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 02d27e3c-0790-4e86-b3f1-9cf06e423758, please check neutron logs for more information. [ 723.103441] env[59620]: ERROR nova.compute.manager [ 723.103441] env[59620]: Traceback (most recent call last): [ 723.103441] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 723.103441] env[59620]: listener.cb(fileno) [ 723.103441] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 723.103441] env[59620]: result = function(*args, **kwargs) [ 723.103441] env[59620]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 723.103441] env[59620]: return func(*args, **kwargs) [ 723.103441] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 723.103441] env[59620]: raise e [ 723.103441] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 723.103441] env[59620]: nwinfo = self.network_api.allocate_for_instance( [ 723.103441] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 723.103441] env[59620]: created_port_ids = self._update_ports_for_instance( [ 723.103441] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 723.103441] env[59620]: with excutils.save_and_reraise_exception(): [ 723.103441] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 723.103441] env[59620]: self.force_reraise() [ 723.103441] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 723.103441] env[59620]: raise self.value [ 723.103441] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 723.103441] env[59620]: updated_port = self._update_port( [ 723.103441] 
env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 723.103441] env[59620]: _ensure_no_port_binding_failure(port) [ 723.103441] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 723.103441] env[59620]: raise exception.PortBindingFailed(port_id=port['id']) [ 723.104265] env[59620]: nova.exception.PortBindingFailed: Binding failed for port 02d27e3c-0790-4e86-b3f1-9cf06e423758, please check neutron logs for more information. [ 723.104265] env[59620]: Removing descriptor: 13 [ 723.104265] env[59620]: ERROR nova.compute.manager [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 02d27e3c-0790-4e86-b3f1-9cf06e423758, please check neutron logs for more information. [ 723.104265] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] Traceback (most recent call last): [ 723.104265] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 723.104265] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] yield resources [ 723.104265] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 723.104265] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] self.driver.spawn(context, instance, image_meta, [ 723.104265] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 723.104265] env[59620]: ERROR nova.compute.manager [instance: 
2e990d70-8e51-4900-9d9d-db920311a8ab] self._vmops.spawn(context, instance, image_meta, injected_files, [ 723.104265] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 723.104265] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] vm_ref = self.build_virtual_machine(instance, [ 723.104590] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 723.104590] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] vif_infos = vmwarevif.get_vif_info(self._session, [ 723.104590] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 723.104590] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] for vif in network_info: [ 723.104590] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 723.104590] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] return self._sync_wrapper(fn, *args, **kwargs) [ 723.104590] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 723.104590] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] self.wait() [ 723.104590] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 723.104590] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] self[:] = self._gt.wait() [ 723.104590] 
env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 723.104590] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] return self._exit_event.wait() [ 723.104590] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 723.104590] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] result = hub.switch() [ 723.104976] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 723.104976] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] return self.greenlet.switch() [ 723.104976] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 723.104976] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] result = function(*args, **kwargs) [ 723.104976] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 723.104976] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] return func(*args, **kwargs) [ 723.104976] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 723.104976] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] raise e [ 723.104976] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] File 
"/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 723.104976] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] nwinfo = self.network_api.allocate_for_instance( [ 723.104976] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 723.104976] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] created_port_ids = self._update_ports_for_instance( [ 723.104976] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 723.105328] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] with excutils.save_and_reraise_exception(): [ 723.105328] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 723.105328] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] self.force_reraise() [ 723.105328] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 723.105328] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] raise self.value [ 723.105328] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 723.105328] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] updated_port = self._update_port( [ 723.105328] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] File 
"/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 723.105328] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] _ensure_no_port_binding_failure(port) [ 723.105328] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 723.105328] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] raise exception.PortBindingFailed(port_id=port['id']) [ 723.105328] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] nova.exception.PortBindingFailed: Binding failed for port 02d27e3c-0790-4e86-b3f1-9cf06e423758, please check neutron logs for more information. [ 723.105328] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] [ 723.105634] env[59620]: INFO nova.compute.manager [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] Terminating instance [ 723.106802] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Acquiring lock "refresh_cache-2e990d70-8e51-4900-9d9d-db920311a8ab" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 723.106950] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Acquired lock "refresh_cache-2e990d70-8e51-4900-9d9d-db920311a8ab" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 723.107182] env[59620]: DEBUG nova.network.neutron [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 
tempest-ServersTestJSON-864844613-project-member] [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 723.177549] env[59620]: DEBUG nova.network.neutron [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 723.239850] env[59620]: DEBUG oslo_concurrency.lockutils [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Acquiring lock "904eb823-1bb4-48b1-8460-c722cbc4652c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 723.240155] env[59620]: DEBUG oslo_concurrency.lockutils [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Lock "904eb823-1bb4-48b1-8460-c722cbc4652c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 723.255461] env[59620]: DEBUG nova.compute.manager [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] Starting instance... 
{{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 723.308166] env[59620]: DEBUG oslo_concurrency.lockutils [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 723.308486] env[59620]: DEBUG oslo_concurrency.lockutils [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 723.310271] env[59620]: INFO nova.compute.claims [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 723.487490] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ddb8e7c5-ea5c-4eeb-916c-18ec79bb9c78 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 723.495730] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-36fbaf76-ce1c-4cc3-80ef-c0e0a7ecee6d {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 723.530987] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-158ba2a2-5360-4583-903e-063a1618e22e {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 723.539417] env[59620]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-905d62ff-3d3b-41e4-8575-b1492454a23f {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 723.553535] env[59620]: DEBUG nova.compute.provider_tree [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 723.565037] env[59620]: DEBUG nova.scheduler.client.report [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 723.579285] env[59620]: DEBUG oslo_concurrency.lockutils [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.271s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 723.579962] env[59620]: DEBUG nova.compute.manager [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] Start building networks 
asynchronously for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 723.633112] env[59620]: DEBUG nova.compute.utils [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Using /dev/sd instead of None {{(pid=59620) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 723.634300] env[59620]: DEBUG nova.compute.manager [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] Allocating IP information in the background. {{(pid=59620) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 723.634566] env[59620]: DEBUG nova.network.neutron [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] allocate_for_instance() {{(pid=59620) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 723.644459] env[59620]: DEBUG nova.compute.manager [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] Start building block device mappings for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 723.724936] env[59620]: DEBUG nova.compute.manager [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] Start spawning the instance on the hypervisor. 
{{(pid=59620) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 723.728604] env[59620]: DEBUG nova.network.neutron [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 723.739089] env[59620]: DEBUG oslo_concurrency.lockutils [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Releasing lock "refresh_cache-21f0c0f4-9fbd-4d34-be36-6dae6538bf9c" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 723.739368] env[59620]: DEBUG nova.compute.manager [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] Start destroying the instance on the hypervisor. 
{{(pid=59620) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 723.739547] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] Destroying instance {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 723.740138] env[59620]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-fff87686-f2a6-46bc-836c-d0503c3dae0a {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 723.750165] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d005aa1-e48f-4111-b9a7-e775c67682aa {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 723.782236] env[59620]: DEBUG nova.virt.hardware [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T20:10:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T20:10:40Z,direct_url=,disk_format='vmdk',id=2efa4364-ba59-4de9-978f-169a769ee710,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f77c77bd878b4e17895863a4a7a8191f',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T20:10:41Z,virtual_size=,visibility=), allow threads: False {{(pid=59620) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:563}} [ 723.782417] env[59620]: DEBUG nova.virt.hardware [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Flavor limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 723.782537] env[59620]: DEBUG nova.virt.hardware [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Image limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 723.782710] env[59620]: DEBUG nova.virt.hardware [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Flavor pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 723.782849] env[59620]: DEBUG nova.virt.hardware [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Image pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 723.782987] env[59620]: DEBUG nova.virt.hardware [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 723.783197] env[59620]: DEBUG nova.virt.hardware [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 723.783378] env[59620]: DEBUG 
nova.virt.hardware [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 723.783497] env[59620]: DEBUG nova.virt.hardware [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Got 1 possible topologies {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 723.783650] env[59620]: DEBUG nova.virt.hardware [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 723.783812] env[59620]: DEBUG nova.virt.hardware [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 723.784896] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-909f5c0c-663f-45ec-bbf8-a1525c5af455 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 723.791653] env[59620]: WARNING nova.virt.vmwareapi.vmops [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c could not be found. 
[ 723.791847] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] Instance destroyed {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 723.792032] env[59620]: INFO nova.compute.manager [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] Took 0.05 seconds to destroy the instance on the hypervisor. [ 723.792262] env[59620]: DEBUG oslo.service.loopingcall [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=59620) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 723.792854] env[59620]: DEBUG nova.compute.manager [-] [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 723.792986] env[59620]: DEBUG nova.network.neutron [-] [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 723.798086] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc6ef7e8-78f0-472c-977f-4f3594ae0e6f {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 723.880906] env[59620]: DEBUG nova.network.neutron [-] [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 723.894867] env[59620]: DEBUG nova.network.neutron [-] [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 723.904595] env[59620]: INFO nova.compute.manager [-] [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] Took 0.11 seconds to deallocate network for instance. [ 723.906744] env[59620]: DEBUG nova.compute.claims [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] Aborting claim: {{(pid=59620) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 723.908779] env[59620]: DEBUG oslo_concurrency.lockutils [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 723.909075] env[59620]: DEBUG oslo_concurrency.lockutils [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.002s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 724.053484] env[59620]: DEBUG nova.policy [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 
'e41f52cd69cb4095bf97438cf32fc145', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '703100dfccf3406aa1580f39e87cdbb1', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59620) authorize /opt/stack/nova/nova/policy.py:203}} [ 724.056384] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f0ef70f-cd22-4248-92c1-686830b253d7 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 724.064987] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-002840a2-1467-4a01-8855-568154a9a926 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 724.101294] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83f9df3f-91b7-49db-bbf5-327cd003ea85 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 724.109518] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4adddff1-a2ac-4c49-a4db-fa64228dfcad {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 724.114492] env[59620]: DEBUG nova.network.neutron [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 724.125590] env[59620]: DEBUG nova.compute.provider_tree [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 
tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 724.129106] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Releasing lock "refresh_cache-2e990d70-8e51-4900-9d9d-db920311a8ab" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 724.129493] env[59620]: DEBUG nova.compute.manager [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] Start destroying the instance on the hypervisor. {{(pid=59620) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 724.129809] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] Destroying instance {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 724.130194] env[59620]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-99aa9b11-574f-4ef3-856d-e4a7b4c7a76c {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 724.138896] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f5f53fd2-4e1d-4cea-a202-56f656510530 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 724.151338] env[59620]: DEBUG nova.scheduler.client.report [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 
tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 724.167906] env[59620]: WARNING nova.virt.vmwareapi.vmops [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 2e990d70-8e51-4900-9d9d-db920311a8ab could not be found. [ 724.167906] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] Instance destroyed {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 724.167906] env[59620]: INFO nova.compute.manager [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] Took 0.04 seconds to destroy the instance on the hypervisor. [ 724.167906] env[59620]: DEBUG oslo.service.loopingcall [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=59620) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 724.167906] env[59620]: DEBUG nova.compute.manager [-] [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 724.168320] env[59620]: DEBUG nova.network.neutron [-] [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 724.170369] env[59620]: DEBUG oslo_concurrency.lockutils [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.261s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 724.171028] env[59620]: ERROR nova.compute.manager [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port a5869fae-2063-4a18-99cd-ceda0844e417, please check neutron logs for more information. 
[ 724.171028] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] Traceback (most recent call last): [ 724.171028] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 724.171028] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] self.driver.spawn(context, instance, image_meta, [ 724.171028] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 724.171028] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 724.171028] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 724.171028] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] vm_ref = self.build_virtual_machine(instance, [ 724.171028] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 724.171028] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] vif_infos = vmwarevif.get_vif_info(self._session, [ 724.171028] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 724.171381] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] for vif in network_info: [ 724.171381] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 724.171381] env[59620]: ERROR 
nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] return self._sync_wrapper(fn, *args, **kwargs) [ 724.171381] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 724.171381] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] self.wait() [ 724.171381] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 724.171381] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] self[:] = self._gt.wait() [ 724.171381] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 724.171381] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] return self._exit_event.wait() [ 724.171381] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 724.171381] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] result = hub.switch() [ 724.171381] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 724.171381] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] return self.greenlet.switch() [ 724.171381] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 724.172166] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] result = function(*args, **kwargs) [ 
724.172166] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 724.172166] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] return func(*args, **kwargs)
[ 724.172166] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 724.172166] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] raise e
[ 724.172166] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 724.172166] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] nwinfo = self.network_api.allocate_for_instance(
[ 724.172166] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 724.172166] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] created_port_ids = self._update_ports_for_instance(
[ 724.172166] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 724.172166] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] with excutils.save_and_reraise_exception():
[ 724.172166] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 724.172166] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] self.force_reraise()
[ 724.172730] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 724.172730] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] raise self.value
[ 724.172730] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 724.172730] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] updated_port = self._update_port(
[ 724.172730] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 724.172730] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] _ensure_no_port_binding_failure(port)
[ 724.172730] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 724.172730] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] raise exception.PortBindingFailed(port_id=port['id'])
[ 724.172730] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] nova.exception.PortBindingFailed: Binding failed for port a5869fae-2063-4a18-99cd-ceda0844e417, please check neutron logs for more information.
[ 724.172730] env[59620]: ERROR nova.compute.manager [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c]
[ 724.173162] env[59620]: DEBUG nova.compute.utils [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] Binding failed for port a5869fae-2063-4a18-99cd-ceda0844e417, please check neutron logs for more information. {{(pid=59620) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 724.173162] env[59620]: DEBUG nova.compute.manager [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] Build of instance 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c was re-scheduled: Binding failed for port a5869fae-2063-4a18-99cd-ceda0844e417, please check neutron logs for more information. {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 724.174079] env[59620]: DEBUG nova.compute.manager [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] Unplugging VIFs for instance {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 724.174079] env[59620]: DEBUG oslo_concurrency.lockutils [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Acquiring lock "refresh_cache-21f0c0f4-9fbd-4d34-be36-6dae6538bf9c" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 724.174079] env[59620]: DEBUG oslo_concurrency.lockutils [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Acquired lock "refresh_cache-21f0c0f4-9fbd-4d34-be36-6dae6538bf9c" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 724.174079] env[59620]: DEBUG nova.network.neutron [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 724.234465] env[59620]: DEBUG nova.network.neutron [-] [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 724.243299] env[59620]: DEBUG nova.network.neutron [-] [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 724.258170] env[59620]: INFO nova.compute.manager [-] [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] Took 0.09 seconds to deallocate network for instance.
[ 724.262644] env[59620]: DEBUG nova.compute.claims [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] Aborting claim: {{(pid=59620) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 724.262834] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 724.263065] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 724.320017] env[59620]: DEBUG nova.network.neutron [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] Successfully created port: c0cf9857-5476-4c79-8c6b-6afade2558c1 {{(pid=59620) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 724.323147] env[59620]: DEBUG nova.network.neutron [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 724.426079] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1f8607c-650a-447b-9ec2-27d4790418f5 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 724.434585] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cb2757eb-e40b-4cea-a2f8-62cb46608320 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 724.467828] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f50a1ac8-f14f-4b26-af2d-a27d3c89941f {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 724.475997] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4566a720-f29d-4c28-b1d8-f22a879b59f7 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 724.494039] env[59620]: DEBUG nova.compute.provider_tree [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 724.500580] env[59620]: DEBUG nova.scheduler.client.report [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 724.518601] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.255s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 724.519356] env[59620]: ERROR nova.compute.manager [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 02d27e3c-0790-4e86-b3f1-9cf06e423758, please check neutron logs for more information.
[ 724.519356] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] Traceback (most recent call last):
[ 724.519356] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 724.519356] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] self.driver.spawn(context, instance, image_meta,
[ 724.519356] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 724.519356] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 724.519356] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 724.519356] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] vm_ref = self.build_virtual_machine(instance,
[ 724.519356] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 724.519356] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] vif_infos = vmwarevif.get_vif_info(self._session,
[ 724.519356] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 724.520337] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] for vif in network_info:
[ 724.520337] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 724.520337] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] return self._sync_wrapper(fn, *args, **kwargs)
[ 724.520337] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 724.520337] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] self.wait()
[ 724.520337] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 724.520337] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] self[:] = self._gt.wait()
[ 724.520337] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 724.520337] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] return self._exit_event.wait()
[ 724.520337] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 724.520337] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] result = hub.switch()
[ 724.520337] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 724.520337] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] return self.greenlet.switch()
[ 724.520337] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 724.520948] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] result = function(*args, **kwargs)
[ 724.520948] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 724.520948] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] return func(*args, **kwargs)
[ 724.520948] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 724.520948] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] raise e
[ 724.520948] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 724.520948] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] nwinfo = self.network_api.allocate_for_instance(
[ 724.520948] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 724.520948] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] created_port_ids = self._update_ports_for_instance(
[ 724.520948] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 724.520948] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] with excutils.save_and_reraise_exception():
[ 724.520948] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 724.520948] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] self.force_reraise()
[ 724.521590] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 724.521590] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] raise self.value
[ 724.521590] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 724.521590] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] updated_port = self._update_port(
[ 724.521590] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 724.521590] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] _ensure_no_port_binding_failure(port)
[ 724.521590] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 724.521590] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] raise exception.PortBindingFailed(port_id=port['id'])
[ 724.521590] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] nova.exception.PortBindingFailed: Binding failed for port 02d27e3c-0790-4e86-b3f1-9cf06e423758, please check neutron logs for more information.
[ 724.521590] env[59620]: ERROR nova.compute.manager [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab]
[ 724.521590] env[59620]: DEBUG nova.compute.utils [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] Binding failed for port 02d27e3c-0790-4e86-b3f1-9cf06e423758, please check neutron logs for more information. {{(pid=59620) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 724.522118] env[59620]: DEBUG nova.compute.manager [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] Build of instance 2e990d70-8e51-4900-9d9d-db920311a8ab was re-scheduled: Binding failed for port 02d27e3c-0790-4e86-b3f1-9cf06e423758, please check neutron logs for more information. {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 724.522379] env[59620]: DEBUG nova.compute.manager [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] Unplugging VIFs for instance {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 724.522606] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Acquiring lock "refresh_cache-2e990d70-8e51-4900-9d9d-db920311a8ab" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 724.524504] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Acquired lock "refresh_cache-2e990d70-8e51-4900-9d9d-db920311a8ab" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 724.524504] env[59620]: DEBUG nova.network.neutron [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 724.770917] env[59620]: DEBUG nova.network.neutron [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] Successfully created port: 48c06748-2a4a-4d66-bca6-e8dde7b09b0f {{(pid=59620) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 724.818010] env[59620]: DEBUG nova.network.neutron [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 725.330327] env[59620]: DEBUG nova.network.neutron [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 725.339574] env[59620]: DEBUG oslo_concurrency.lockutils [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Releasing lock "refresh_cache-21f0c0f4-9fbd-4d34-be36-6dae6538bf9c" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 725.339989] env[59620]: DEBUG nova.compute.manager [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 725.339989] env[59620]: DEBUG nova.compute.manager [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 725.340144] env[59620]: DEBUG nova.network.neutron [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 725.450257] env[59620]: DEBUG nova.network.neutron [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 725.459211] env[59620]: DEBUG nova.network.neutron [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 725.463281] env[59620]: DEBUG nova.network.neutron [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 725.472447] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Releasing lock "refresh_cache-2e990d70-8e51-4900-9d9d-db920311a8ab" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 725.472787] env[59620]: DEBUG nova.compute.manager [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 725.473055] env[59620]: DEBUG nova.compute.manager [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 725.473229] env[59620]: DEBUG nova.network.neutron [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 725.477837] env[59620]: INFO nova.compute.manager [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] [instance: 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c] Took 0.14 seconds to deallocate network for instance.
[ 725.535860] env[59620]: DEBUG nova.network.neutron [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 725.545124] env[59620]: DEBUG nova.network.neutron [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 725.565313] env[59620]: INFO nova.compute.manager [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 2e990d70-8e51-4900-9d9d-db920311a8ab] Took 0.09 seconds to deallocate network for instance.
[ 725.598159] env[59620]: INFO nova.scheduler.client.report [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Deleted allocations for instance 21f0c0f4-9fbd-4d34-be36-6dae6538bf9c
[ 725.625047] env[59620]: DEBUG oslo_concurrency.lockutils [None req-7c8fd412-d97d-418d-a0d3-8aa4ff37eb76 tempest-ServerRescueTestJSONUnderV235-105789702 tempest-ServerRescueTestJSONUnderV235-105789702-project-member] Lock "21f0c0f4-9fbd-4d34-be36-6dae6538bf9c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 16.448s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 725.690642] env[59620]: INFO nova.scheduler.client.report [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Deleted allocations for instance 2e990d70-8e51-4900-9d9d-db920311a8ab
[ 725.715845] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c6b55c46-7f72-40a1-b17e-865f57a02ebd tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "2e990d70-8e51-4900-9d9d-db920311a8ab" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 12.054s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 726.505549] env[59620]: DEBUG oslo_concurrency.lockutils [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Acquiring lock "8e7db1a1-c3b2-44a9-a33b-d3544ef57f77" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 726.506077] env[59620]: DEBUG oslo_concurrency.lockutils [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Lock "8e7db1a1-c3b2-44a9-a33b-d3544ef57f77" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 726.518029] env[59620]: DEBUG nova.compute.manager [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] Starting instance... {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 726.583437] env[59620]: DEBUG oslo_concurrency.lockutils [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 726.583666] env[59620]: DEBUG oslo_concurrency.lockutils [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 726.585885] env[59620]: INFO nova.compute.claims [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 726.745889] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e84e6512-fed8-4596-bce8-2feb3cbc6fc7 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 726.755128] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82bedd13-706e-4e32-8597-d312d2242a82 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 726.800986] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f74be67-1221-476f-9af3-c9f0115973d1 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 726.813709] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-17fad354-1d55-487e-9037-fe2bb9ddd1db {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 726.824184] env[59620]: DEBUG nova.compute.provider_tree [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 726.836916] env[59620]: DEBUG nova.scheduler.client.report [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 726.855769] env[59620]: DEBUG oslo_concurrency.lockutils [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.272s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 726.856281] env[59620]: DEBUG nova.compute.manager [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] Start building networks asynchronously for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 726.898968] env[59620]: DEBUG nova.compute.utils [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Using /dev/sd instead of None {{(pid=59620) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 726.900309] env[59620]: DEBUG nova.compute.manager [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] Allocating IP information in the background. {{(pid=59620) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 726.900479] env[59620]: DEBUG nova.network.neutron [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] allocate_for_instance() {{(pid=59620) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 726.914528] env[59620]: DEBUG nova.compute.manager [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] Start building block device mappings for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 726.986483] env[59620]: DEBUG nova.compute.manager [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] Start spawning the instance on the hypervisor. {{(pid=59620) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 727.018047] env[59620]: DEBUG nova.virt.hardware [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T20:10:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T20:10:40Z,direct_url=,disk_format='vmdk',id=2efa4364-ba59-4de9-978f-169a769ee710,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f77c77bd878b4e17895863a4a7a8191f',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T20:10:41Z,virtual_size=,visibility=), allow threads: False {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 727.018047] env[59620]: DEBUG nova.virt.hardware [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Flavor limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 727.018306] env[59620]: DEBUG nova.virt.hardware [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Image limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 727.018619] env[59620]: DEBUG nova.virt.hardware [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Flavor pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 727.018884] env[59620]: DEBUG nova.virt.hardware [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Image pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 727.019145] env[59620]: DEBUG nova.virt.hardware [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 727.020334] env[59620]: DEBUG nova.virt.hardware [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 727.020790] env[59620]: DEBUG nova.virt.hardware [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 727.021318] env[59620]: DEBUG nova.virt.hardware
[None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Got 1 possible topologies {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 727.021318] env[59620]: DEBUG nova.virt.hardware [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 727.021547] env[59620]: DEBUG nova.virt.hardware [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 727.022920] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fbc8f545-00ff-416e-9b7e-147443d14a17 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 727.037680] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-77887a16-0a3f-42c0-9eb3-d810fc34ca5b {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 727.109761] env[59620]: DEBUG nova.network.neutron [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] Successfully created port: 17131708-5879-4378-bd78-cf76a598e578 {{(pid=59620) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 727.302296] env[59620]: DEBUG nova.policy [None req-0667b0c4-3702-4402-9355-e46427832c0e 
tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '666cd8bcdfa84942bc4ea57bf9ad5a00', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '69518eb542f84a3b9ce9cb5a4f266c9a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59620) authorize /opt/stack/nova/nova/policy.py:203}} [ 728.586355] env[59620]: ERROR nova.compute.manager [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 4ca49a09-f39b-4b5d-8ffe-428938f9486e, please check neutron logs for more information. 
[ 728.586355] env[59620]: ERROR nova.compute.manager Traceback (most recent call last): [ 728.586355] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 728.586355] env[59620]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 728.586355] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 728.586355] env[59620]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 728.586355] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 728.586355] env[59620]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 728.586355] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 728.586355] env[59620]: ERROR nova.compute.manager self.force_reraise() [ 728.586355] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 728.586355] env[59620]: ERROR nova.compute.manager raise self.value [ 728.586355] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 728.586355] env[59620]: ERROR nova.compute.manager updated_port = self._update_port( [ 728.586355] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 728.586355] env[59620]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 728.588295] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 728.588295] env[59620]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 728.588295] env[59620]: ERROR 
nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 4ca49a09-f39b-4b5d-8ffe-428938f9486e, please check neutron logs for more information. [ 728.588295] env[59620]: ERROR nova.compute.manager [ 728.588295] env[59620]: Traceback (most recent call last): [ 728.588295] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 728.588295] env[59620]: listener.cb(fileno) [ 728.588295] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 728.588295] env[59620]: result = function(*args, **kwargs) [ 728.588295] env[59620]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 728.588295] env[59620]: return func(*args, **kwargs) [ 728.588295] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 728.588295] env[59620]: raise e [ 728.588295] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 728.588295] env[59620]: nwinfo = self.network_api.allocate_for_instance( [ 728.588295] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 728.588295] env[59620]: created_port_ids = self._update_ports_for_instance( [ 728.588295] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 728.588295] env[59620]: with excutils.save_and_reraise_exception(): [ 728.588295] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 728.588295] env[59620]: self.force_reraise() [ 728.588295] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 728.588295] env[59620]: raise self.value [ 728.588295] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 728.588295] env[59620]: updated_port = self._update_port( [ 728.588295] 
env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 728.588295] env[59620]: _ensure_no_port_binding_failure(port) [ 728.588295] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 728.588295] env[59620]: raise exception.PortBindingFailed(port_id=port['id']) [ 728.589369] env[59620]: nova.exception.PortBindingFailed: Binding failed for port 4ca49a09-f39b-4b5d-8ffe-428938f9486e, please check neutron logs for more information. [ 728.589369] env[59620]: Removing descriptor: 19 [ 728.589369] env[59620]: ERROR nova.compute.manager [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 4ca49a09-f39b-4b5d-8ffe-428938f9486e, please check neutron logs for more information. [ 728.589369] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] Traceback (most recent call last): [ 728.589369] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 728.589369] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] yield resources [ 728.589369] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 728.589369] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] self.driver.spawn(context, instance, image_meta, [ 728.589369] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 728.589369] env[59620]: ERROR nova.compute.manager [instance: 
449cb7ea-c7e9-411c-9c09-f451d892d32c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 728.589369] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 728.589369] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] vm_ref = self.build_virtual_machine(instance, [ 728.589768] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 728.589768] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] vif_infos = vmwarevif.get_vif_info(self._session, [ 728.589768] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 728.589768] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] for vif in network_info: [ 728.589768] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 728.589768] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] return self._sync_wrapper(fn, *args, **kwargs) [ 728.589768] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 728.589768] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] self.wait() [ 728.589768] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 728.589768] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] self[:] = self._gt.wait() [ 728.589768] 
env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 728.589768] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] return self._exit_event.wait() [ 728.589768] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 728.589768] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] result = hub.switch() [ 728.590110] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 728.590110] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] return self.greenlet.switch() [ 728.590110] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 728.590110] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] result = function(*args, **kwargs) [ 728.590110] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 728.590110] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] return func(*args, **kwargs) [ 728.590110] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 728.590110] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] raise e [ 728.590110] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] File 
"/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 728.590110] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] nwinfo = self.network_api.allocate_for_instance( [ 728.590110] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 728.590110] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] created_port_ids = self._update_ports_for_instance( [ 728.590110] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 728.590641] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] with excutils.save_and_reraise_exception(): [ 728.590641] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 728.590641] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] self.force_reraise() [ 728.590641] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 728.590641] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] raise self.value [ 728.590641] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 728.590641] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] updated_port = self._update_port( [ 728.590641] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] File 
"/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 728.590641] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] _ensure_no_port_binding_failure(port) [ 728.590641] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 728.590641] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] raise exception.PortBindingFailed(port_id=port['id']) [ 728.590641] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] nova.exception.PortBindingFailed: Binding failed for port 4ca49a09-f39b-4b5d-8ffe-428938f9486e, please check neutron logs for more information. [ 728.590641] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] [ 728.591651] env[59620]: INFO nova.compute.manager [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] Terminating instance [ 728.593598] env[59620]: DEBUG oslo_concurrency.lockutils [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Acquiring lock "refresh_cache-449cb7ea-c7e9-411c-9c09-f451d892d32c" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 728.593598] env[59620]: DEBUG oslo_concurrency.lockutils [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Acquired lock "refresh_cache-449cb7ea-c7e9-411c-9c09-f451d892d32c" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 728.593598] env[59620]: DEBUG nova.network.neutron [None 
req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 728.707151] env[59620]: DEBUG nova.network.neutron [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 728.837808] env[59620]: DEBUG nova.compute.manager [req-3d0b9051-138a-4563-bd5d-6dc256f4a786 req-b230501b-f5e4-4220-a43c-5155cd5e7ce1 service nova] [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] Received event network-changed-4ca49a09-f39b-4b5d-8ffe-428938f9486e {{(pid=59620) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 728.838524] env[59620]: DEBUG nova.compute.manager [req-3d0b9051-138a-4563-bd5d-6dc256f4a786 req-b230501b-f5e4-4220-a43c-5155cd5e7ce1 service nova] [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] Refreshing instance network info cache due to event network-changed-4ca49a09-f39b-4b5d-8ffe-428938f9486e. 
{{(pid=59620) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 728.838524] env[59620]: DEBUG oslo_concurrency.lockutils [req-3d0b9051-138a-4563-bd5d-6dc256f4a786 req-b230501b-f5e4-4220-a43c-5155cd5e7ce1 service nova] Acquiring lock "refresh_cache-449cb7ea-c7e9-411c-9c09-f451d892d32c" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 729.653346] env[59620]: DEBUG nova.network.neutron [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 729.666774] env[59620]: DEBUG oslo_concurrency.lockutils [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Releasing lock "refresh_cache-449cb7ea-c7e9-411c-9c09-f451d892d32c" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 729.667270] env[59620]: DEBUG nova.compute.manager [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] Start destroying the instance on the hypervisor. 
{{(pid=59620) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 729.667464] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] Destroying instance {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 729.667779] env[59620]: DEBUG oslo_concurrency.lockutils [req-3d0b9051-138a-4563-bd5d-6dc256f4a786 req-b230501b-f5e4-4220-a43c-5155cd5e7ce1 service nova] Acquired lock "refresh_cache-449cb7ea-c7e9-411c-9c09-f451d892d32c" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 729.667955] env[59620]: DEBUG nova.network.neutron [req-3d0b9051-138a-4563-bd5d-6dc256f4a786 req-b230501b-f5e4-4220-a43c-5155cd5e7ce1 service nova] [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] Refreshing network info cache for port 4ca49a09-f39b-4b5d-8ffe-428938f9486e {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 729.670482] env[59620]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-05ec3614-aa1f-48cc-9455-fc50a245a933 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.684714] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf911fed-3b01-4fbe-b9a7-1ed78cc13f51 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.710086] env[59620]: WARNING nova.virt.vmwareapi.vmops [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 449cb7ea-c7e9-411c-9c09-f451d892d32c could not be found. 
[ 729.710493] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] Instance destroyed {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 729.710786] env[59620]: INFO nova.compute.manager [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] Took 0.04 seconds to destroy the instance on the hypervisor. [ 729.711131] env[59620]: DEBUG oslo.service.loopingcall [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59620) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 729.711483] env[59620]: DEBUG nova.compute.manager [-] [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 729.711688] env[59620]: DEBUG nova.network.neutron [-] [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 729.785029] env[59620]: DEBUG nova.network.neutron [req-3d0b9051-138a-4563-bd5d-6dc256f4a786 req-b230501b-f5e4-4220-a43c-5155cd5e7ce1 service nova] [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 729.802904] env[59620]: DEBUG nova.network.neutron [-] [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 729.818048] env[59620]: DEBUG nova.network.neutron [-] [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 729.828515] env[59620]: INFO nova.compute.manager [-] [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] Took 0.12 seconds to deallocate network for instance. [ 729.834862] env[59620]: DEBUG nova.compute.claims [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] Aborting claim: {{(pid=59620) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 729.834862] env[59620]: DEBUG oslo_concurrency.lockutils [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 729.834862] env[59620]: DEBUG oslo_concurrency.lockutils [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 729.957218] env[59620]: DEBUG nova.network.neutron [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] Successfully created port: d7751aee-44fe-4f97-94a8-f9c9b96e7bfe {{(pid=59620) _create_port_minimal 
/opt/stack/nova/nova/network/neutron.py:548}} [ 729.968742] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-76d81872-56be-43b8-87aa-1a398b578fde {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.976940] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f3c38353-76e4-43a2-bfd5-7292cf3ce0ef {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 730.010226] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-50faeaa1-fa83-4972-b3bb-b65bee054e63 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 730.017891] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3beb168-f928-42a3-98d6-69aa09b2da8f {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 730.032013] env[59620]: DEBUG nova.compute.provider_tree [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 730.041488] env[59620]: DEBUG nova.scheduler.client.report [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 
'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 730.059627] env[59620]: DEBUG oslo_concurrency.lockutils [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.225s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 730.060315] env[59620]: ERROR nova.compute.manager [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 4ca49a09-f39b-4b5d-8ffe-428938f9486e, please check neutron logs for more information. 
[ 730.060315] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] Traceback (most recent call last): [ 730.060315] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 730.060315] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] self.driver.spawn(context, instance, image_meta, [ 730.060315] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 730.060315] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 730.060315] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 730.060315] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] vm_ref = self.build_virtual_machine(instance, [ 730.060315] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 730.060315] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] vif_infos = vmwarevif.get_vif_info(self._session, [ 730.060315] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 730.060695] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] for vif in network_info: [ 730.060695] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 730.060695] env[59620]: ERROR 
nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] return self._sync_wrapper(fn, *args, **kwargs) [ 730.060695] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 730.060695] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] self.wait() [ 730.060695] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 730.060695] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] self[:] = self._gt.wait() [ 730.060695] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 730.060695] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] return self._exit_event.wait() [ 730.060695] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 730.060695] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] result = hub.switch() [ 730.060695] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 730.060695] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] return self.greenlet.switch() [ 730.060695] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 730.061151] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] result = function(*args, **kwargs) [ 
730.061151] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 730.061151] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] return func(*args, **kwargs) [ 730.061151] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 730.061151] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] raise e [ 730.061151] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 730.061151] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] nwinfo = self.network_api.allocate_for_instance( [ 730.061151] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 730.061151] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] created_port_ids = self._update_ports_for_instance( [ 730.061151] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 730.061151] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] with excutils.save_and_reraise_exception(): [ 730.061151] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 730.061151] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] self.force_reraise() [ 730.061556] env[59620]: ERROR nova.compute.manager [instance: 
449cb7ea-c7e9-411c-9c09-f451d892d32c] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 730.061556] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] raise self.value [ 730.061556] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 730.061556] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] updated_port = self._update_port( [ 730.061556] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 730.061556] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] _ensure_no_port_binding_failure(port) [ 730.061556] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 730.061556] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] raise exception.PortBindingFailed(port_id=port['id']) [ 730.061556] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] nova.exception.PortBindingFailed: Binding failed for port 4ca49a09-f39b-4b5d-8ffe-428938f9486e, please check neutron logs for more information. [ 730.061556] env[59620]: ERROR nova.compute.manager [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] [ 730.061556] env[59620]: DEBUG nova.compute.utils [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] Binding failed for port 4ca49a09-f39b-4b5d-8ffe-428938f9486e, please check neutron logs for more information. 
{{(pid=59620) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 730.062852] env[59620]: DEBUG nova.compute.manager [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] Build of instance 449cb7ea-c7e9-411c-9c09-f451d892d32c was re-scheduled: Binding failed for port 4ca49a09-f39b-4b5d-8ffe-428938f9486e, please check neutron logs for more information. {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 730.063206] env[59620]: DEBUG nova.compute.manager [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] Unplugging VIFs for instance {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 730.063544] env[59620]: DEBUG oslo_concurrency.lockutils [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Acquiring lock "refresh_cache-449cb7ea-c7e9-411c-9c09-f451d892d32c" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 730.333465] env[59620]: DEBUG nova.network.neutron [req-3d0b9051-138a-4563-bd5d-6dc256f4a786 req-b230501b-f5e4-4220-a43c-5155cd5e7ce1 service nova] [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 730.342090] env[59620]: DEBUG oslo_concurrency.lockutils [req-3d0b9051-138a-4563-bd5d-6dc256f4a786 req-b230501b-f5e4-4220-a43c-5155cd5e7ce1 service nova] Releasing lock "refresh_cache-449cb7ea-c7e9-411c-9c09-f451d892d32c" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 
730.342752] env[59620]: DEBUG oslo_concurrency.lockutils [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Acquired lock "refresh_cache-449cb7ea-c7e9-411c-9c09-f451d892d32c" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 730.342752] env[59620]: DEBUG nova.network.neutron [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 730.403519] env[59620]: DEBUG nova.network.neutron [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 731.078204] env[59620]: DEBUG nova.network.neutron [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 731.095103] env[59620]: DEBUG oslo_concurrency.lockutils [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Releasing lock "refresh_cache-449cb7ea-c7e9-411c-9c09-f451d892d32c" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 731.095329] env[59620]: DEBUG nova.compute.manager [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 731.095508] env[59620]: DEBUG nova.compute.manager [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 731.095667] env[59620]: DEBUG nova.network.neutron [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 731.179605] env[59620]: DEBUG nova.network.neutron [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 731.189769] env[59620]: DEBUG nova.network.neutron [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 731.199654] env[59620]: INFO nova.compute.manager [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: 449cb7ea-c7e9-411c-9c09-f451d892d32c] Took 0.10 seconds to deallocate network for instance. 
[ 731.318130] env[59620]: INFO nova.scheduler.client.report [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Deleted allocations for instance 449cb7ea-c7e9-411c-9c09-f451d892d32c [ 731.337816] env[59620]: DEBUG oslo_concurrency.lockutils [None req-b76fd492-9944-4e8c-acb0-b73c7abb52f1 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "449cb7ea-c7e9-411c-9c09-f451d892d32c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 17.080s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 731.346830] env[59620]: DEBUG oslo_concurrency.lockutils [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Acquiring lock "6da1d5a5-ff2a-478d-97b8-bf237f844bae" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 731.347129] env[59620]: DEBUG oslo_concurrency.lockutils [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "6da1d5a5-ff2a-478d-97b8-bf237f844bae" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 731.357911] env[59620]: DEBUG nova.compute.manager [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Starting instance... 
{{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 731.413919] env[59620]: DEBUG oslo_concurrency.lockutils [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 731.414294] env[59620]: DEBUG oslo_concurrency.lockutils [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 731.416168] env[59620]: INFO nova.compute.claims [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 731.587621] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb24d364-c708-460c-b6f1-1bd90706f8de {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 731.598044] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-884cb719-1c31-4cfc-83a6-c39822ec918a {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 731.634789] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-53e5a290-a05b-4495-90f9-329f8c62968f {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 731.643072] env[59620]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2e2f24b7-e095-4707-9367-14055b9c5767 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 731.659559] env[59620]: DEBUG nova.compute.provider_tree [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 731.668819] env[59620]: DEBUG nova.scheduler.client.report [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 731.684574] env[59620]: DEBUG oslo_concurrency.lockutils [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.270s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 731.684903] env[59620]: DEBUG nova.compute.manager [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Start building networks 
asynchronously for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 731.719097] env[59620]: DEBUG nova.compute.utils [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Using /dev/sd instead of None {{(pid=59620) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 731.721678] env[59620]: DEBUG nova.compute.manager [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Allocating IP information in the background. {{(pid=59620) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 731.721678] env[59620]: DEBUG nova.network.neutron [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] allocate_for_instance() {{(pid=59620) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 731.738384] env[59620]: DEBUG nova.compute.manager [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Start building block device mappings for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 731.829379] env[59620]: DEBUG nova.compute.manager [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Start spawning the instance on the hypervisor. 
{{(pid=59620) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 731.855587] env[59620]: DEBUG nova.virt.hardware [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T20:10:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T20:10:40Z,direct_url=,disk_format='vmdk',id=2efa4364-ba59-4de9-978f-169a769ee710,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f77c77bd878b4e17895863a4a7a8191f',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T20:10:41Z,virtual_size=,visibility=), allow threads: False {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 731.855894] env[59620]: DEBUG nova.virt.hardware [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Flavor limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 731.855965] env[59620]: DEBUG nova.virt.hardware [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Image limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 731.856110] env[59620]: DEBUG nova.virt.hardware [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Flavor pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:388}} [ 731.856250] env[59620]: DEBUG nova.virt.hardware [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Image pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 731.856393] env[59620]: DEBUG nova.virt.hardware [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 731.856876] env[59620]: DEBUG nova.virt.hardware [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 731.857126] env[59620]: DEBUG nova.virt.hardware [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 731.857320] env[59620]: DEBUG nova.virt.hardware [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Got 1 possible topologies {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 731.857518] env[59620]: DEBUG nova.virt.hardware [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:575}} [ 731.857949] env[59620]: DEBUG nova.virt.hardware [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 731.859094] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-792b2681-dce2-437c-bdfd-6f8c7d3c9150 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 731.870770] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1485f060-4b0a-43b9-a7a8-65aa19f61201 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 732.176735] env[59620]: DEBUG nova.policy [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f40b2254538745bf9406408a158a55fd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bd8ec082967e4839810b492d2af942a6', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59620) authorize /opt/stack/nova/nova/policy.py:203}} [ 732.639110] env[59620]: DEBUG oslo_concurrency.lockutils [None req-6e475844-38a1-4d7f-8805-9d7f50da0778 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Acquiring lock "6da1d5a5-ff2a-478d-97b8-bf237f844bae" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=59620) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 733.910153] env[59620]: ERROR nova.compute.manager [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port c0cf9857-5476-4c79-8c6b-6afade2558c1, please check neutron logs for more information. [ 733.910153] env[59620]: ERROR nova.compute.manager Traceback (most recent call last): [ 733.910153] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 733.910153] env[59620]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 733.910153] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 733.910153] env[59620]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 733.910153] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 733.910153] env[59620]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 733.910153] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 733.910153] env[59620]: ERROR nova.compute.manager self.force_reraise() [ 733.910153] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 733.910153] env[59620]: ERROR nova.compute.manager raise self.value [ 733.910153] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 733.910153] env[59620]: ERROR nova.compute.manager updated_port = self._update_port( [ 733.910153] env[59620]: ERROR 
nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 733.910153] env[59620]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 733.910797] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 733.910797] env[59620]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 733.910797] env[59620]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port c0cf9857-5476-4c79-8c6b-6afade2558c1, please check neutron logs for more information. [ 733.910797] env[59620]: ERROR nova.compute.manager [ 733.910797] env[59620]: Traceback (most recent call last): [ 733.910797] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 733.910797] env[59620]: listener.cb(fileno) [ 733.910797] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 733.910797] env[59620]: result = function(*args, **kwargs) [ 733.910797] env[59620]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 733.910797] env[59620]: return func(*args, **kwargs) [ 733.910797] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 733.910797] env[59620]: raise e [ 733.910797] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 733.910797] env[59620]: nwinfo = self.network_api.allocate_for_instance( [ 733.910797] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 733.910797] env[59620]: created_port_ids = self._update_ports_for_instance( [ 733.910797] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 733.910797] env[59620]: with excutils.save_and_reraise_exception(): [ 733.910797] env[59620]: File 
"/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 733.910797] env[59620]: self.force_reraise() [ 733.910797] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 733.910797] env[59620]: raise self.value [ 733.910797] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 733.910797] env[59620]: updated_port = self._update_port( [ 733.910797] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 733.910797] env[59620]: _ensure_no_port_binding_failure(port) [ 733.910797] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 733.910797] env[59620]: raise exception.PortBindingFailed(port_id=port['id']) [ 733.911637] env[59620]: nova.exception.PortBindingFailed: Binding failed for port c0cf9857-5476-4c79-8c6b-6afade2558c1, please check neutron logs for more information. [ 733.911637] env[59620]: Removing descriptor: 14 [ 733.911637] env[59620]: ERROR nova.compute.manager [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port c0cf9857-5476-4c79-8c6b-6afade2558c1, please check neutron logs for more information. 
[ 733.911637] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] Traceback (most recent call last): [ 733.911637] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 733.911637] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] yield resources [ 733.911637] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 733.911637] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] self.driver.spawn(context, instance, image_meta, [ 733.911637] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 733.911637] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 733.911637] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 733.911637] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] vm_ref = self.build_virtual_machine(instance, [ 733.912020] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 733.912020] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] vif_infos = vmwarevif.get_vif_info(self._session, [ 733.912020] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 733.912020] env[59620]: ERROR 
nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] for vif in network_info: [ 733.912020] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 733.912020] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] return self._sync_wrapper(fn, *args, **kwargs) [ 733.912020] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 733.912020] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] self.wait() [ 733.912020] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 733.912020] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] self[:] = self._gt.wait() [ 733.912020] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 733.912020] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] return self._exit_event.wait() [ 733.912020] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 733.912020] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] result = hub.switch() [ 733.912418] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 733.912418] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] return self.greenlet.switch() [ 733.912418] env[59620]: ERROR 
nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 733.912418] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] result = function(*args, **kwargs) [ 733.912418] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 733.912418] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] return func(*args, **kwargs) [ 733.912418] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 733.912418] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] raise e [ 733.912418] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 733.912418] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] nwinfo = self.network_api.allocate_for_instance( [ 733.912418] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 733.912418] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] created_port_ids = self._update_ports_for_instance( [ 733.912418] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 733.912821] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] with excutils.save_and_reraise_exception(): [ 733.912821] env[59620]: ERROR nova.compute.manager [instance: 
9999064a-7edc-4e2c-92fb-7a713194764c] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 733.912821] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] self.force_reraise() [ 733.912821] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 733.912821] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] raise self.value [ 733.912821] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 733.912821] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] updated_port = self._update_port( [ 733.912821] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 733.912821] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] _ensure_no_port_binding_failure(port) [ 733.912821] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 733.912821] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] raise exception.PortBindingFailed(port_id=port['id']) [ 733.912821] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] nova.exception.PortBindingFailed: Binding failed for port c0cf9857-5476-4c79-8c6b-6afade2558c1, please check neutron logs for more information. 
[ 733.912821] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] [ 733.914833] env[59620]: INFO nova.compute.manager [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] Terminating instance [ 733.917062] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Acquiring lock "refresh_cache-9999064a-7edc-4e2c-92fb-7a713194764c" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 733.917062] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Acquired lock "refresh_cache-9999064a-7edc-4e2c-92fb-7a713194764c" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 733.917062] env[59620]: DEBUG nova.network.neutron [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 734.012962] env[59620]: DEBUG nova.network.neutron [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 734.430819] env[59620]: ERROR nova.compute.manager [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 48c06748-2a4a-4d66-bca6-e8dde7b09b0f, please check neutron logs for more information. [ 734.430819] env[59620]: ERROR nova.compute.manager Traceback (most recent call last): [ 734.430819] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 734.430819] env[59620]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 734.430819] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 734.430819] env[59620]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 734.430819] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 734.430819] env[59620]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 734.430819] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 734.430819] env[59620]: ERROR nova.compute.manager self.force_reraise() [ 734.430819] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 734.430819] env[59620]: ERROR nova.compute.manager raise self.value [ 734.430819] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 734.430819] env[59620]: ERROR nova.compute.manager updated_port = self._update_port( [ 734.430819] env[59620]: ERROR 
nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 734.430819] env[59620]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 734.431386] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 734.431386] env[59620]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 734.431386] env[59620]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 48c06748-2a4a-4d66-bca6-e8dde7b09b0f, please check neutron logs for more information. [ 734.431386] env[59620]: ERROR nova.compute.manager [ 734.431386] env[59620]: Traceback (most recent call last): [ 734.431386] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 734.431386] env[59620]: listener.cb(fileno) [ 734.431386] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 734.431386] env[59620]: result = function(*args, **kwargs) [ 734.431386] env[59620]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 734.431386] env[59620]: return func(*args, **kwargs) [ 734.431386] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 734.431386] env[59620]: raise e [ 734.431386] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 734.431386] env[59620]: nwinfo = self.network_api.allocate_for_instance( [ 734.431386] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 734.431386] env[59620]: created_port_ids = self._update_ports_for_instance( [ 734.431386] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 734.431386] env[59620]: with excutils.save_and_reraise_exception(): [ 734.431386] env[59620]: File 
"/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 734.431386] env[59620]: self.force_reraise() [ 734.431386] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 734.431386] env[59620]: raise self.value [ 734.431386] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 734.431386] env[59620]: updated_port = self._update_port( [ 734.431386] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 734.431386] env[59620]: _ensure_no_port_binding_failure(port) [ 734.431386] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 734.431386] env[59620]: raise exception.PortBindingFailed(port_id=port['id']) [ 734.432298] env[59620]: nova.exception.PortBindingFailed: Binding failed for port 48c06748-2a4a-4d66-bca6-e8dde7b09b0f, please check neutron logs for more information. [ 734.432298] env[59620]: Removing descriptor: 16 [ 734.432298] env[59620]: ERROR nova.compute.manager [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 48c06748-2a4a-4d66-bca6-e8dde7b09b0f, please check neutron logs for more information. 
[ 734.432298] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] Traceback (most recent call last): [ 734.432298] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 734.432298] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] yield resources [ 734.432298] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 734.432298] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] self.driver.spawn(context, instance, image_meta, [ 734.432298] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 734.432298] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] self._vmops.spawn(context, instance, image_meta, injected_files, [ 734.432298] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 734.432298] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] vm_ref = self.build_virtual_machine(instance, [ 734.432656] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 734.432656] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] vif_infos = vmwarevif.get_vif_info(self._session, [ 734.432656] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 734.432656] env[59620]: ERROR 
nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] for vif in network_info: [ 734.432656] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 734.432656] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] return self._sync_wrapper(fn, *args, **kwargs) [ 734.432656] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 734.432656] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] self.wait() [ 734.432656] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 734.432656] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] self[:] = self._gt.wait() [ 734.432656] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 734.432656] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] return self._exit_event.wait() [ 734.432656] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 734.432656] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] result = hub.switch() [ 734.433060] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 734.433060] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] return self.greenlet.switch() [ 734.433060] env[59620]: ERROR 
nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 734.433060] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] result = function(*args, **kwargs) [ 734.433060] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 734.433060] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] return func(*args, **kwargs) [ 734.433060] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 734.433060] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] raise e [ 734.433060] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 734.433060] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] nwinfo = self.network_api.allocate_for_instance( [ 734.433060] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 734.433060] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] created_port_ids = self._update_ports_for_instance( [ 734.433060] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 734.433433] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] with excutils.save_and_reraise_exception(): [ 734.433433] env[59620]: ERROR nova.compute.manager [instance: 
a1f06df7-a38c-431d-98ee-7b3df8224ea1] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 734.433433] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] self.force_reraise() [ 734.433433] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 734.433433] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] raise self.value [ 734.433433] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 734.433433] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] updated_port = self._update_port( [ 734.433433] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 734.433433] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] _ensure_no_port_binding_failure(port) [ 734.433433] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 734.433433] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] raise exception.PortBindingFailed(port_id=port['id']) [ 734.433433] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] nova.exception.PortBindingFailed: Binding failed for port 48c06748-2a4a-4d66-bca6-e8dde7b09b0f, please check neutron logs for more information. 
[ 734.433433] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] [ 734.433840] env[59620]: INFO nova.compute.manager [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] Terminating instance [ 734.433840] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Acquiring lock "refresh_cache-a1f06df7-a38c-431d-98ee-7b3df8224ea1" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 734.433840] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Acquired lock "refresh_cache-a1f06df7-a38c-431d-98ee-7b3df8224ea1" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 734.433840] env[59620]: DEBUG nova.network.neutron [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 734.566334] env[59620]: DEBUG nova.network.neutron [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 734.748987] env[59620]: DEBUG oslo_concurrency.lockutils [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Acquiring lock "7b5558e4-05fc-4755-accf-77228272884f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 734.749318] env[59620]: DEBUG oslo_concurrency.lockutils [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Lock "7b5558e4-05fc-4755-accf-77228272884f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 734.763690] env[59620]: DEBUG nova.compute.manager [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 7b5558e4-05fc-4755-accf-77228272884f] Starting instance... 
{{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 734.802725] env[59620]: DEBUG nova.network.neutron [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Successfully created port: 448e1b14-7b08-4172-9ff9-7e7ba7e84a68 {{(pid=59620) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 734.821456] env[59620]: DEBUG oslo_concurrency.lockutils [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 734.821881] env[59620]: DEBUG oslo_concurrency.lockutils [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 734.823841] env[59620]: INFO nova.compute.claims [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 7b5558e4-05fc-4755-accf-77228272884f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 734.985911] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea5315da-57ef-47a6-8a2e-486a7df73687 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 734.994671] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx 
with opID=oslo.vmware-ec9dcde2-4e66-4362-845e-571dd3e446ce {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.031746] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6dbb91a9-03a4-42f5-849a-e1e1b231b6be {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.038474] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4b6d2e2-29d6-4c38-8965-c97ea5f46f38 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.053667] env[59620]: DEBUG nova.compute.provider_tree [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 735.063386] env[59620]: DEBUG nova.scheduler.client.report [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 735.076735] env[59620]: DEBUG nova.network.neutron [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 
tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 735.084997] env[59620]: DEBUG oslo_concurrency.lockutils [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.263s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 735.085423] env[59620]: DEBUG nova.compute.manager [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 7b5558e4-05fc-4755-accf-77228272884f] Start building networks asynchronously for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 735.089218] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Releasing lock "refresh_cache-9999064a-7edc-4e2c-92fb-7a713194764c" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 735.089584] env[59620]: DEBUG nova.compute.manager [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] Start destroying the instance on the hypervisor. 
{{(pid=59620) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 735.089970] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] Destroying instance {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 735.090239] env[59620]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-272b03bd-31df-45aa-ad58-bf14092321e6 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.099606] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11f1281e-ef44-4d8b-a3d3-d62c304d27b8 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.119154] env[59620]: DEBUG nova.compute.utils [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Using /dev/sd instead of None {{(pid=59620) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 735.125125] env[59620]: DEBUG nova.compute.manager [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 7b5558e4-05fc-4755-accf-77228272884f] Allocating IP information in the background. 
{{(pid=59620) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 735.125339] env[59620]: DEBUG nova.network.neutron [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 7b5558e4-05fc-4755-accf-77228272884f] allocate_for_instance() {{(pid=59620) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 735.128656] env[59620]: WARNING nova.virt.vmwareapi.vmops [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 9999064a-7edc-4e2c-92fb-7a713194764c could not be found. [ 735.128656] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] Instance destroyed {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 735.128656] env[59620]: INFO nova.compute.manager [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] Took 0.04 seconds to destroy the instance on the hypervisor. [ 735.128656] env[59620]: DEBUG oslo.service.loopingcall [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=59620) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 735.129086] env[59620]: DEBUG nova.compute.manager [-] [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 735.129086] env[59620]: DEBUG nova.network.neutron [-] [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 735.134874] env[59620]: DEBUG nova.compute.manager [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 7b5558e4-05fc-4755-accf-77228272884f] Start building block device mappings for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 735.210660] env[59620]: DEBUG nova.compute.manager [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 7b5558e4-05fc-4755-accf-77228272884f] Start spawning the instance on the hypervisor. {{(pid=59620) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 735.214168] env[59620]: DEBUG nova.network.neutron [-] [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 735.221859] env[59620]: DEBUG nova.network.neutron [-] [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 735.235554] env[59620]: INFO nova.compute.manager [-] [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] Took 0.11 seconds to deallocate network for instance. 
[ 735.239839] env[59620]: DEBUG nova.compute.claims [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] Aborting claim: {{(pid=59620) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 735.240017] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 735.240240] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 735.246013] env[59620]: DEBUG nova.virt.hardware [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T20:10:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta 
ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T20:10:40Z,direct_url=,disk_format='vmdk',id=2efa4364-ba59-4de9-978f-169a769ee710,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f77c77bd878b4e17895863a4a7a8191f',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T20:10:41Z,virtual_size=,visibility=), allow threads: False {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 735.246235] env[59620]: DEBUG nova.virt.hardware [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Flavor limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 735.246678] env[59620]: DEBUG nova.virt.hardware [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Image limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 735.246878] env[59620]: DEBUG nova.virt.hardware [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Flavor pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 735.247034] env[59620]: DEBUG nova.virt.hardware [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Image pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 735.247181] env[59620]: DEBUG nova.virt.hardware [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Chose 
sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 735.247387] env[59620]: DEBUG nova.virt.hardware [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 735.247544] env[59620]: DEBUG nova.virt.hardware [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 735.247706] env[59620]: DEBUG nova.virt.hardware [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Got 1 possible topologies {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 735.247870] env[59620]: DEBUG nova.virt.hardware [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 735.248035] env[59620]: DEBUG nova.virt.hardware [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 735.249116] env[59620]: 
DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-acbeb3e1-91d5-4fbd-9a5e-a85cf932b76f {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.259190] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-02c5119d-3cef-41bb-85b3-d4cafd27c642 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.336221] env[59620]: DEBUG nova.network.neutron [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 735.351589] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Releasing lock "refresh_cache-a1f06df7-a38c-431d-98ee-7b3df8224ea1" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 735.351997] env[59620]: DEBUG nova.compute.manager [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] Start destroying the instance on the hypervisor. 
{{(pid=59620) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 735.352211] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] Destroying instance {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 735.353727] env[59620]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-00e347cf-1ef9-477b-a1cf-478c6352d5ba {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.365442] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-324e1c21-77c9-40cf-a0c9-bc60f770d6ab {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.396178] env[59620]: WARNING nova.virt.vmwareapi.vmops [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance a1f06df7-a38c-431d-98ee-7b3df8224ea1 could not be found. [ 735.396383] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] Instance destroyed {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 735.396633] env[59620]: INFO nova.compute.manager [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 735.396901] env[59620]: DEBUG oslo.service.loopingcall [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=59620) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 735.398691] env[59620]: DEBUG nova.compute.manager [-] [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 735.398788] env[59620]: DEBUG nova.network.neutron [-] [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 735.401081] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dbb24488-d98e-4ff9-95d5-be6e5d3375e3 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.409144] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed6d04ab-8c19-473c-9d57-902433f7ebc7 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.445667] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f4390706-126a-4e4f-99a2-323f5ee316f3 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.455643] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-afa1221d-0e41-4335-8ebe-619762a4d71d {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.471854] env[59620]: DEBUG nova.compute.provider_tree [None 
req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 735.486749] env[59620]: DEBUG nova.scheduler.client.report [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 735.503547] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.263s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 735.504176] env[59620]: ERROR nova.compute.manager [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port c0cf9857-5476-4c79-8c6b-6afade2558c1, please check neutron logs for more information. 
[ 735.504176] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] Traceback (most recent call last): [ 735.504176] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 735.504176] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] self.driver.spawn(context, instance, image_meta, [ 735.504176] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 735.504176] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 735.504176] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 735.504176] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] vm_ref = self.build_virtual_machine(instance, [ 735.504176] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 735.504176] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] vif_infos = vmwarevif.get_vif_info(self._session, [ 735.504176] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 735.504650] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] for vif in network_info: [ 735.504650] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 735.504650] env[59620]: ERROR 
nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] return self._sync_wrapper(fn, *args, **kwargs) [ 735.504650] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 735.504650] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] self.wait() [ 735.504650] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 735.504650] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] self[:] = self._gt.wait() [ 735.504650] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 735.504650] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] return self._exit_event.wait() [ 735.504650] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 735.504650] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] result = hub.switch() [ 735.504650] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 735.504650] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] return self.greenlet.switch() [ 735.504650] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 735.506135] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] result = function(*args, **kwargs) [ 
735.506135] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 735.506135] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] return func(*args, **kwargs) [ 735.506135] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 735.506135] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] raise e [ 735.506135] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 735.506135] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] nwinfo = self.network_api.allocate_for_instance( [ 735.506135] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 735.506135] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] created_port_ids = self._update_ports_for_instance( [ 735.506135] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 735.506135] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] with excutils.save_and_reraise_exception(): [ 735.506135] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 735.506135] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] self.force_reraise() [ 735.507145] env[59620]: ERROR nova.compute.manager [instance: 
9999064a-7edc-4e2c-92fb-7a713194764c] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 735.507145] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] raise self.value [ 735.507145] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 735.507145] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] updated_port = self._update_port( [ 735.507145] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 735.507145] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] _ensure_no_port_binding_failure(port) [ 735.507145] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 735.507145] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] raise exception.PortBindingFailed(port_id=port['id']) [ 735.507145] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] nova.exception.PortBindingFailed: Binding failed for port c0cf9857-5476-4c79-8c6b-6afade2558c1, please check neutron logs for more information. [ 735.507145] env[59620]: ERROR nova.compute.manager [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] [ 735.507145] env[59620]: DEBUG nova.compute.utils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] Binding failed for port c0cf9857-5476-4c79-8c6b-6afade2558c1, please check neutron logs for more information. 
{{(pid=59620) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 735.508666] env[59620]: DEBUG nova.compute.manager [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] Build of instance 9999064a-7edc-4e2c-92fb-7a713194764c was re-scheduled: Binding failed for port c0cf9857-5476-4c79-8c6b-6afade2558c1, please check neutron logs for more information. {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 735.508666] env[59620]: DEBUG nova.compute.manager [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] Unplugging VIFs for instance {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 735.508666] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Acquiring lock "refresh_cache-9999064a-7edc-4e2c-92fb-7a713194764c" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 735.508666] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Acquired lock "refresh_cache-9999064a-7edc-4e2c-92fb-7a713194764c" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 735.508895] env[59620]: DEBUG nova.network.neutron [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] Building network info cache for instance 
{{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 735.673629] env[59620]: DEBUG nova.network.neutron [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 735.736662] env[59620]: DEBUG nova.network.neutron [-] [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 735.746511] env[59620]: DEBUG nova.network.neutron [-] [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 735.761045] env[59620]: INFO nova.compute.manager [-] [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] Took 0.36 seconds to deallocate network for instance. 
[ 735.766336] env[59620]: DEBUG nova.compute.claims [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] Aborting claim: {{(pid=59620) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 735.766336] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 735.766336] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 735.773107] env[59620]: DEBUG nova.policy [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7b24e88854ce4efb81412f81ab12f923', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '035fb02e7d5e4870a9853822e21bff7b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59620) authorize /opt/stack/nova/nova/policy.py:203}} [ 735.927399] env[59620]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d89c44a-ad30-4bfd-9ede-93e568a82cec {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.937224] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4253914d-c9aa-4171-9a03-6177eb4bb82c {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.974025] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb8ce11f-88ad-4045-a971-9f93a1aa9ca8 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.980397] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-329beb22-54aa-4e84-b711-e6ec031104b8 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.993074] env[59620]: DEBUG nova.compute.provider_tree [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 736.014382] env[59620]: DEBUG nova.scheduler.client.report [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 
'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 736.036275] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.272s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 736.036882] env[59620]: ERROR nova.compute.manager [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 48c06748-2a4a-4d66-bca6-e8dde7b09b0f, please check neutron logs for more information. [ 736.036882] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] Traceback (most recent call last): [ 736.036882] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 736.036882] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] self.driver.spawn(context, instance, image_meta, [ 736.036882] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 736.036882] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] self._vmops.spawn(context, instance, image_meta, injected_files, [ 736.036882] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 736.036882] env[59620]: ERROR 
nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] vm_ref = self.build_virtual_machine(instance, [ 736.036882] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 736.036882] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] vif_infos = vmwarevif.get_vif_info(self._session, [ 736.036882] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 736.037320] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] for vif in network_info: [ 736.037320] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 736.037320] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] return self._sync_wrapper(fn, *args, **kwargs) [ 736.037320] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 736.037320] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] self.wait() [ 736.037320] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 736.037320] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] self[:] = self._gt.wait() [ 736.037320] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 736.037320] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] return self._exit_event.wait() [ 
736.037320] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 736.037320] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] result = hub.switch() [ 736.037320] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 736.037320] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] return self.greenlet.switch() [ 736.037320] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 736.037830] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] result = function(*args, **kwargs) [ 736.037830] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 736.037830] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] return func(*args, **kwargs) [ 736.037830] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 736.037830] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] raise e [ 736.037830] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 736.037830] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] nwinfo = self.network_api.allocate_for_instance( [ 736.037830] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] 
File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 736.037830] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] created_port_ids = self._update_ports_for_instance( [ 736.037830] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 736.037830] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] with excutils.save_and_reraise_exception(): [ 736.037830] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 736.037830] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] self.force_reraise() [ 736.038358] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 736.038358] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] raise self.value [ 736.038358] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 736.038358] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] updated_port = self._update_port( [ 736.038358] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 736.038358] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] _ensure_no_port_binding_failure(port) [ 736.038358] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] File 
"/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 736.038358] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] raise exception.PortBindingFailed(port_id=port['id']) [ 736.038358] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] nova.exception.PortBindingFailed: Binding failed for port 48c06748-2a4a-4d66-bca6-e8dde7b09b0f, please check neutron logs for more information. [ 736.038358] env[59620]: ERROR nova.compute.manager [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] [ 736.038358] env[59620]: DEBUG nova.compute.utils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] Binding failed for port 48c06748-2a4a-4d66-bca6-e8dde7b09b0f, please check neutron logs for more information. {{(pid=59620) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 736.039343] env[59620]: DEBUG nova.compute.manager [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] Build of instance a1f06df7-a38c-431d-98ee-7b3df8224ea1 was re-scheduled: Binding failed for port 48c06748-2a4a-4d66-bca6-e8dde7b09b0f, please check neutron logs for more information. 
{{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 736.039821] env[59620]: DEBUG nova.compute.manager [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] Unplugging VIFs for instance {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 736.039821] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Acquiring lock "refresh_cache-a1f06df7-a38c-431d-98ee-7b3df8224ea1" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 736.039989] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Acquired lock "refresh_cache-a1f06df7-a38c-431d-98ee-7b3df8224ea1" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 736.040190] env[59620]: DEBUG nova.network.neutron [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 736.185528] env[59620]: DEBUG nova.network.neutron [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 736.453377] env[59620]: DEBUG nova.network.neutron [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 736.467263] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Releasing lock "refresh_cache-9999064a-7edc-4e2c-92fb-7a713194764c" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 736.467483] env[59620]: DEBUG nova.compute.manager [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 736.468060] env[59620]: DEBUG nova.compute.manager [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 736.468060] env[59620]: DEBUG nova.network.neutron [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 736.565612] env[59620]: DEBUG nova.network.neutron [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 736.578231] env[59620]: DEBUG nova.network.neutron [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 736.587459] env[59620]: INFO nova.compute.manager [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: 9999064a-7edc-4e2c-92fb-7a713194764c] Took 0.12 seconds to deallocate network for instance. 
[ 736.707165] env[59620]: INFO nova.scheduler.client.report [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Deleted allocations for instance 9999064a-7edc-4e2c-92fb-7a713194764c [ 736.737235] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "9999064a-7edc-4e2c-92fb-7a713194764c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 16.467s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 736.885870] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Acquiring lock "50bbfdd5-bac5-4634-bc5d-c215a31889e9" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 736.886170] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Lock "50bbfdd5-bac5-4634-bc5d-c215a31889e9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 736.899037] env[59620]: DEBUG nova.compute.manager [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] Starting instance... 
{{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 736.968939] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 736.968939] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 736.970188] env[59620]: INFO nova.compute.claims [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 737.725620] env[59620]: ERROR nova.compute.manager [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 17131708-5879-4378-bd78-cf76a598e578, please check neutron logs for more information. 
[ 737.725620] env[59620]: ERROR nova.compute.manager Traceback (most recent call last): [ 737.725620] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 737.725620] env[59620]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 737.725620] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 737.725620] env[59620]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 737.725620] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 737.725620] env[59620]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 737.725620] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 737.725620] env[59620]: ERROR nova.compute.manager self.force_reraise() [ 737.725620] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 737.725620] env[59620]: ERROR nova.compute.manager raise self.value [ 737.725620] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 737.725620] env[59620]: ERROR nova.compute.manager updated_port = self._update_port( [ 737.725620] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 737.725620] env[59620]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 737.727071] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 737.727071] env[59620]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 737.727071] env[59620]: ERROR 
nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 17131708-5879-4378-bd78-cf76a598e578, please check neutron logs for more information. [ 737.727071] env[59620]: ERROR nova.compute.manager [ 737.727071] env[59620]: Traceback (most recent call last): [ 737.727071] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 737.727071] env[59620]: listener.cb(fileno) [ 737.727071] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 737.727071] env[59620]: result = function(*args, **kwargs) [ 737.727071] env[59620]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 737.727071] env[59620]: return func(*args, **kwargs) [ 737.727071] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 737.727071] env[59620]: raise e [ 737.727071] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 737.727071] env[59620]: nwinfo = self.network_api.allocate_for_instance( [ 737.727071] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 737.727071] env[59620]: created_port_ids = self._update_ports_for_instance( [ 737.727071] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 737.727071] env[59620]: with excutils.save_and_reraise_exception(): [ 737.727071] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 737.727071] env[59620]: self.force_reraise() [ 737.727071] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 737.727071] env[59620]: raise self.value [ 737.727071] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 737.727071] env[59620]: updated_port = self._update_port( [ 737.727071] 
env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 737.727071] env[59620]: _ensure_no_port_binding_failure(port) [ 737.727071] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 737.727071] env[59620]: raise exception.PortBindingFailed(port_id=port['id']) [ 737.728104] env[59620]: nova.exception.PortBindingFailed: Binding failed for port 17131708-5879-4378-bd78-cf76a598e578, please check neutron logs for more information. [ 737.728104] env[59620]: Removing descriptor: 15 [ 737.728104] env[59620]: ERROR nova.compute.manager [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port d7751aee-44fe-4f97-94a8-f9c9b96e7bfe, please check neutron logs for more information. [ 737.728104] env[59620]: ERROR nova.compute.manager Traceback (most recent call last): [ 737.728104] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 737.728104] env[59620]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 737.728104] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 737.728104] env[59620]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 737.728104] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 737.728104] env[59620]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 737.728104] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 737.728104] env[59620]: ERROR nova.compute.manager self.force_reraise() [ 
737.728104] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 737.728104] env[59620]: ERROR nova.compute.manager raise self.value [ 737.728104] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 737.728104] env[59620]: ERROR nova.compute.manager updated_port = self._update_port( [ 737.728506] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 737.728506] env[59620]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 737.728506] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 737.728506] env[59620]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 737.728506] env[59620]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port d7751aee-44fe-4f97-94a8-f9c9b96e7bfe, please check neutron logs for more information. 
[ 737.728506] env[59620]: ERROR nova.compute.manager [ 737.728506] env[59620]: Traceback (most recent call last): [ 737.728506] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 737.728506] env[59620]: listener.cb(fileno) [ 737.728506] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 737.728506] env[59620]: result = function(*args, **kwargs) [ 737.728506] env[59620]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 737.728506] env[59620]: return func(*args, **kwargs) [ 737.728506] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 737.728506] env[59620]: raise e [ 737.728506] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 737.728506] env[59620]: nwinfo = self.network_api.allocate_for_instance( [ 737.728506] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 737.728506] env[59620]: created_port_ids = self._update_ports_for_instance( [ 737.728506] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 737.728506] env[59620]: with excutils.save_and_reraise_exception(): [ 737.728506] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 737.728506] env[59620]: self.force_reraise() [ 737.728506] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 737.728506] env[59620]: raise self.value [ 737.728506] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 737.728506] env[59620]: updated_port = self._update_port( [ 737.728506] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 737.728506] env[59620]: _ensure_no_port_binding_failure(port) [ 737.729658] 
env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 737.729658] env[59620]: raise exception.PortBindingFailed(port_id=port['id']) [ 737.729658] env[59620]: nova.exception.PortBindingFailed: Binding failed for port d7751aee-44fe-4f97-94a8-f9c9b96e7bfe, please check neutron logs for more information. [ 737.729658] env[59620]: Removing descriptor: 13 [ 737.729658] env[59620]: DEBUG nova.network.neutron [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 737.729658] env[59620]: DEBUG nova.compute.manager [req-4077c82f-91e8-4368-bb2d-3651d539a7b9 req-5897ff3d-44d6-4803-87e5-e08c68c98b03 service nova] [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] Received event network-changed-d7751aee-44fe-4f97-94a8-f9c9b96e7bfe {{(pid=59620) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 737.729658] env[59620]: DEBUG nova.compute.manager [req-4077c82f-91e8-4368-bb2d-3651d539a7b9 req-5897ff3d-44d6-4803-87e5-e08c68c98b03 service nova] [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] Refreshing instance network info cache due to event network-changed-d7751aee-44fe-4f97-94a8-f9c9b96e7bfe. 
{{(pid=59620) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 737.729658] env[59620]: DEBUG oslo_concurrency.lockutils [req-4077c82f-91e8-4368-bb2d-3651d539a7b9 req-5897ff3d-44d6-4803-87e5-e08c68c98b03 service nova] Acquiring lock "refresh_cache-8e7db1a1-c3b2-44a9-a33b-d3544ef57f77" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 737.730030] env[59620]: DEBUG oslo_concurrency.lockutils [req-4077c82f-91e8-4368-bb2d-3651d539a7b9 req-5897ff3d-44d6-4803-87e5-e08c68c98b03 service nova] Acquired lock "refresh_cache-8e7db1a1-c3b2-44a9-a33b-d3544ef57f77" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 737.730030] env[59620]: DEBUG nova.network.neutron [req-4077c82f-91e8-4368-bb2d-3651d539a7b9 req-5897ff3d-44d6-4803-87e5-e08c68c98b03 service nova] [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] Refreshing network info cache for port d7751aee-44fe-4f97-94a8-f9c9b96e7bfe {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 737.731755] env[59620]: ERROR nova.compute.manager [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 17131708-5879-4378-bd78-cf76a598e578, please check neutron logs for more information. 
[ 737.731755] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] Traceback (most recent call last): [ 737.731755] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 737.731755] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] yield resources [ 737.731755] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 737.731755] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] self.driver.spawn(context, instance, image_meta, [ 737.731755] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 737.731755] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 737.731755] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 737.731755] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] vm_ref = self.build_virtual_machine(instance, [ 737.731755] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 737.732248] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] vif_infos = vmwarevif.get_vif_info(self._session, [ 737.732248] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 737.732248] env[59620]: ERROR 
nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] for vif in network_info:
[ 737.732248] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]   File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 737.732248] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]     return self._sync_wrapper(fn, *args, **kwargs)
[ 737.732248] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]   File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 737.732248] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]     self.wait()
[ 737.732248] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]   File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 737.732248] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]     self[:] = self._gt.wait()
[ 737.732248] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 737.732248] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]     return self._exit_event.wait()
[ 737.732248] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]   File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 737.732248] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]     result = hub.switch()
[ 737.732248] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 737.732815] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]     return self.greenlet.switch()
[ 737.732815] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 737.732815] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]     result = function(*args, **kwargs)
[ 737.732815] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 737.732815] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]     return func(*args, **kwargs)
[ 737.732815] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]   File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 737.732815] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]     raise e
[ 737.732815] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 737.732815] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]     nwinfo = self.network_api.allocate_for_instance(
[ 737.732815] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 737.732815] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]     created_port_ids = self._update_ports_for_instance(
[ 737.732815] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 737.732815] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]     with excutils.save_and_reraise_exception():
[ 737.733319] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 737.733319] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]     self.force_reraise()
[ 737.733319] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 737.733319] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]     raise self.value
[ 737.733319] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 737.733319] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]     updated_port = self._update_port(
[ 737.733319] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 737.733319] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]     _ensure_no_port_binding_failure(port)
[ 737.733319] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 737.733319] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]     raise exception.PortBindingFailed(port_id=port['id'])
[ 737.733319] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] nova.exception.PortBindingFailed: Binding failed for port 17131708-5879-4378-bd78-cf76a598e578, please check neutron logs for more information.
[ 737.733319] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]
[ 737.734399] env[59620]: INFO nova.compute.manager [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] Terminating instance
[ 737.734399] env[59620]: DEBUG oslo_concurrency.lockutils [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Acquiring lock "refresh_cache-904eb823-1bb4-48b1-8460-c722cbc4652c" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 737.734399] env[59620]: DEBUG oslo_concurrency.lockutils [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Acquired lock "refresh_cache-904eb823-1bb4-48b1-8460-c722cbc4652c" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 737.734399] env[59620]: DEBUG nova.network.neutron [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 737.737654] env[59620]: ERROR nova.compute.manager [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port d7751aee-44fe-4f97-94a8-f9c9b96e7bfe, please check neutron logs for more information.
[ 737.737654] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] Traceback (most recent call last):
[ 737.737654] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77]   File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 737.737654] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77]     yield resources
[ 737.737654] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 737.737654] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77]     self.driver.spawn(context, instance, image_meta,
[ 737.737654] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 737.737654] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 737.737654] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 737.737654] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77]     vm_ref = self.build_virtual_machine(instance,
[ 737.737654] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 737.738157] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77]     vif_infos = vmwarevif.get_vif_info(self._session,
[ 737.738157] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77]   File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 737.738157] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77]     for vif in network_info:
[ 737.738157] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77]   File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 737.738157] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77]     return self._sync_wrapper(fn, *args, **kwargs)
[ 737.738157] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77]   File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 737.738157] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77]     self.wait()
[ 737.738157] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77]   File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 737.738157] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77]     self[:] = self._gt.wait()
[ 737.738157] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 737.738157] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77]     return self._exit_event.wait()
[ 737.738157] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77]   File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 737.738157] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77]     result = hub.switch()
[ 737.738157] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77]   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 737.738515] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77]     return self.greenlet.switch()
[ 737.738515] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 737.738515] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77]     result = function(*args, **kwargs)
[ 737.738515] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77]   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 737.738515] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77]     return func(*args, **kwargs)
[ 737.738515] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77]   File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 737.738515] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77]     raise e
[ 737.738515] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77]   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 737.738515] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77]     nwinfo = self.network_api.allocate_for_instance(
[ 737.738515] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77]   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 737.738515] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77]     created_port_ids = self._update_ports_for_instance(
[ 737.738515] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77]   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 737.738515] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77]     with excutils.save_and_reraise_exception():
[ 737.738821] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 737.738821] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77]     self.force_reraise()
[ 737.738821] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 737.738821] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77]     raise self.value
[ 737.738821] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77]   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 737.738821] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77]     updated_port = self._update_port(
[ 737.738821] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77]   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 737.738821] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77]     _ensure_no_port_binding_failure(port)
[ 737.738821] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77]   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 737.738821] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77]     raise exception.PortBindingFailed(port_id=port['id'])
[ 737.738821] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] nova.exception.PortBindingFailed: Binding failed for port d7751aee-44fe-4f97-94a8-f9c9b96e7bfe, please check neutron logs for more information.
[ 737.738821] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77]
[ 737.739358] env[59620]: INFO nova.compute.manager [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] Terminating instance
[ 737.739358] env[59620]: DEBUG oslo_concurrency.lockutils [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Acquiring lock "refresh_cache-8e7db1a1-c3b2-44a9-a33b-d3544ef57f77" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 737.745501] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Releasing lock "refresh_cache-a1f06df7-a38c-431d-98ee-7b3df8224ea1" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 737.745689] env[59620]: DEBUG nova.compute.manager [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 737.745857] env[59620]: DEBUG nova.compute.manager [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 737.746033] env[59620]: DEBUG nova.network.neutron [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 737.820624] env[59620]: DEBUG nova.network.neutron [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 737.835784] env[59620]: DEBUG nova.network.neutron [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 737.852049] env[59620]: INFO nova.compute.manager [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] [instance: a1f06df7-a38c-431d-98ee-7b3df8224ea1] Took 0.11 seconds to deallocate network for instance.
[ 737.896829] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4282f916-a3d7-487e-87b3-4fa2e0dfbcaf {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 737.905771] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f9efa7b-ed47-44b2-9df8-815f957128ef {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 737.938746] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc3bab49-f404-402a-9b45-b2ec6b217ae7 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 737.948130] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dce8ee57-eb9a-43aa-9b0a-3f144bde3176 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 737.962278] env[59620]: DEBUG nova.compute.provider_tree [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 737.964676] env[59620]: INFO nova.scheduler.client.report [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Deleted allocations for instance a1f06df7-a38c-431d-98ee-7b3df8224ea1
[ 737.972647] env[59620]: DEBUG nova.scheduler.client.report [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 737.983807] env[59620]: DEBUG oslo_concurrency.lockutils [None req-c72486c5-7027-4a26-8bfa-084688e9cad9 tempest-MultipleCreateTestJSON-1361786026 tempest-MultipleCreateTestJSON-1361786026-project-member] Lock "a1f06df7-a38c-431d-98ee-7b3df8224ea1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 17.687s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 738.004802] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.037s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 738.005597] env[59620]: DEBUG nova.compute.manager [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] Start building networks asynchronously for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 738.025770] env[59620]: DEBUG nova.network.neutron [req-4077c82f-91e8-4368-bb2d-3651d539a7b9 req-5897ff3d-44d6-4803-87e5-e08c68c98b03 service nova] [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 738.039537] env[59620]: DEBUG nova.network.neutron [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 738.050634] env[59620]: DEBUG nova.compute.utils [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Using /dev/sd instead of None {{(pid=59620) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 738.051935] env[59620]: DEBUG nova.compute.manager [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] Not allocating networking since 'none' was specified. {{(pid=59620) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1948}}
[ 738.077248] env[59620]: DEBUG nova.compute.manager [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] Start building block device mappings for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 738.168137] env[59620]: DEBUG nova.compute.manager [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] Start spawning the instance on the hypervisor. {{(pid=59620) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 738.196417] env[59620]: DEBUG nova.virt.hardware [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T20:10:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T20:10:40Z,direct_url=,disk_format='vmdk',id=2efa4364-ba59-4de9-978f-169a769ee710,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f77c77bd878b4e17895863a4a7a8191f',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T20:10:41Z,virtual_size=,visibility=), allow threads: False {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 738.196644] env[59620]: DEBUG nova.virt.hardware [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Flavor limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 738.196788] env[59620]: DEBUG nova.virt.hardware [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Image limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 738.196961] env[59620]: DEBUG nova.virt.hardware [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Flavor pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 738.197112] env[59620]: DEBUG nova.virt.hardware [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Image pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 738.197250] env[59620]: DEBUG nova.virt.hardware [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 738.197448] env[59620]: DEBUG nova.virt.hardware [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 738.197605] env[59620]: DEBUG nova.virt.hardware [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 738.197762] env[59620]: DEBUG nova.virt.hardware [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Got 1 possible topologies {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 738.197913] env[59620]: DEBUG nova.virt.hardware [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 738.198091] env[59620]: DEBUG nova.virt.hardware [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 738.198953] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa7ecf07-6751-42e2-af4f-e1d6b9ecb29d {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 738.204941] env[59620]: DEBUG oslo_service.periodic_task [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59620) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 738.205240] env[59620]: DEBUG nova.compute.manager [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Starting heal instance info cache {{(pid=59620) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}}
[ 738.205373] env[59620]: DEBUG nova.compute.manager [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Rebuilding the list of instances to heal {{(pid=59620) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}}
[ 738.220669] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-faa20e6d-ead6-4b0a-983e-82ba1fbd0b87 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 738.224877] env[59620]: DEBUG nova.compute.manager [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] Skipping network cache update for instance because it is Building. {{(pid=59620) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 738.224877] env[59620]: DEBUG nova.compute.manager [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] Skipping network cache update for instance because it is Building. {{(pid=59620) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 738.225018] env[59620]: DEBUG nova.compute.manager [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Skipping network cache update for instance because it is Building. {{(pid=59620) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 738.225090] env[59620]: DEBUG nova.compute.manager [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] [instance: 7b5558e4-05fc-4755-accf-77228272884f] Skipping network cache update for instance because it is Building. {{(pid=59620) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 738.225212] env[59620]: DEBUG nova.compute.manager [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] Skipping network cache update for instance because it is Building. {{(pid=59620) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 738.225330] env[59620]: DEBUG nova.compute.manager [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Didn't find any instances for network info cache update. {{(pid=59620) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}}
[ 738.235534] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] Instance VIF info [] {{(pid=59620) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 738.240987] env[59620]: DEBUG nova.virt.vmwareapi.vm_util [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Creating folder: Project (8863ad7f29dc48a59a77f57a53f759c6). Parent ref: group-v280263. {{(pid=59620) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 738.241870] env[59620]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-78d976bb-cf21-416c-8657-c2d4caea7eec {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 738.252597] env[59620]: INFO nova.virt.vmwareapi.vm_util [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Created folder: Project (8863ad7f29dc48a59a77f57a53f759c6) in parent group-v280263.
[ 738.253243] env[59620]: DEBUG nova.virt.vmwareapi.vm_util [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Creating folder: Instances. Parent ref: group-v280275. {{(pid=59620) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 738.253630] env[59620]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-5fdaf0ad-524c-487b-afa2-cd75f0f7fa4e {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 738.267335] env[59620]: INFO nova.virt.vmwareapi.vm_util [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Created folder: Instances in parent group-v280275.
[ 738.267335] env[59620]: DEBUG oslo.service.loopingcall [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59620) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 738.267753] env[59620]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] Creating VM on the ESX host {{(pid=59620) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 738.267989] env[59620]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b3c997f9-c5af-465f-bf89-4e4ac28a3d1a {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 738.285937] env[59620]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 738.285937] env[59620]:   value = "task-1308641"
[ 738.285937] env[59620]:   _type = "Task"
[ 738.285937] env[59620]: } to complete. {{(pid=59620) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 738.293942] env[59620]: DEBUG oslo_vmware.api [-] Task: {'id': task-1308641, 'name': CreateVM_Task} progress is 0%. {{(pid=59620) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 738.334041] env[59620]: DEBUG oslo_concurrency.lockutils [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Acquiring lock "d85db5e9-ce70-477d-bb5c-7665ab69b19a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 738.334574] env[59620]: DEBUG oslo_concurrency.lockutils [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Lock "d85db5e9-ce70-477d-bb5c-7665ab69b19a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 738.346127] env[59620]: DEBUG nova.compute.manager [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] Starting instance... {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 738.416625] env[59620]: DEBUG oslo_concurrency.lockutils [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 738.416875] env[59620]: DEBUG oslo_concurrency.lockutils [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 738.418432] env[59620]: INFO nova.compute.claims [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 738.592579] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2472b1c6-eb95-4c42-9000-2b1eba5c3e31 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 738.603643] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3839aab0-b36a-42ed-9e75-fd73e8dff28e {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 738.638761] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-559f8d43-9054-47d7-8f20-60e234902f5f {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 738.648832] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-34ba9a54-0bbe-4c6e-81c6-df01b368e431 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 738.667557] env[59620]: DEBUG nova.compute.provider_tree [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 738.677563] env[59620]: DEBUG nova.scheduler.client.report [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 738.705064] env[59620]: DEBUG oslo_concurrency.lockutils [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.288s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 738.706838] env[59620]: DEBUG nova.compute.manager [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151
tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] Start building networks asynchronously for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 738.743552] env[59620]: DEBUG nova.compute.utils [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Using /dev/sd instead of None {{(pid=59620) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 738.745247] env[59620]: DEBUG nova.compute.manager [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] Allocating IP information in the background. {{(pid=59620) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 738.745531] env[59620]: DEBUG nova.network.neutron [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] allocate_for_instance() {{(pid=59620) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 738.759918] env[59620]: DEBUG nova.compute.manager [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] Start building block device mappings for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 738.799731] env[59620]: DEBUG oslo_vmware.api [-] Task: {'id': task-1308641, 'name': CreateVM_Task, 'duration_secs': 0.26041} completed successfully. 
{{(pid=59620) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 738.800662] env[59620]: DEBUG nova.network.neutron [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 7b5558e4-05fc-4755-accf-77228272884f] Successfully created port: 6319fa19-1380-4966-b525-c126a2fb5462 {{(pid=59620) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 738.803679] env[59620]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] Created VM on the ESX host {{(pid=59620) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 738.804965] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/2efa4364-ba59-4de9-978f-169a769ee710" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 738.805190] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Acquired lock "[datastore1] devstack-image-cache_base/2efa4364-ba59-4de9-978f-169a769ee710" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 738.805534] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/2efa4364-ba59-4de9-978f-169a769ee710" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 738.810328] env[59620]: DEBUG oslo_vmware.service [-] Invoking 
HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-01ef34f3-9188-4e87-b333-81e61d714930 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 738.820219] env[59620]: DEBUG oslo_vmware.api [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Waiting for the task: (returnval){ [ 738.820219] env[59620]: value = "session[52dd88a5-b9b5-fd00-4be4-1e40bbf1fd5a]5215c742-709d-1451-e4cc-088b2387b6d4" [ 738.820219] env[59620]: _type = "Task" [ 738.820219] env[59620]: } to complete. {{(pid=59620) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 738.830032] env[59620]: DEBUG oslo_vmware.api [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Task: {'id': session[52dd88a5-b9b5-fd00-4be4-1e40bbf1fd5a]5215c742-709d-1451-e4cc-088b2387b6d4, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59620) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 738.835874] env[59620]: DEBUG nova.network.neutron [req-4077c82f-91e8-4368-bb2d-3651d539a7b9 req-5897ff3d-44d6-4803-87e5-e08c68c98b03 service nova] [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 738.844572] env[59620]: DEBUG nova.compute.manager [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] Start spawning the instance on the hypervisor. 
{{(pid=59620) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 738.848947] env[59620]: DEBUG oslo_concurrency.lockutils [req-4077c82f-91e8-4368-bb2d-3651d539a7b9 req-5897ff3d-44d6-4803-87e5-e08c68c98b03 service nova] Releasing lock "refresh_cache-8e7db1a1-c3b2-44a9-a33b-d3544ef57f77" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 738.849326] env[59620]: DEBUG oslo_concurrency.lockutils [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Acquired lock "refresh_cache-8e7db1a1-c3b2-44a9-a33b-d3544ef57f77" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 738.849496] env[59620]: DEBUG nova.network.neutron [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 738.870379] env[59620]: DEBUG nova.virt.hardware [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T20:10:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta 
ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T20:10:40Z,direct_url=,disk_format='vmdk',id=2efa4364-ba59-4de9-978f-169a769ee710,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f77c77bd878b4e17895863a4a7a8191f',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T20:10:41Z,virtual_size=,visibility=), allow threads: False {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 738.870826] env[59620]: DEBUG nova.virt.hardware [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Flavor limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 738.871051] env[59620]: DEBUG nova.virt.hardware [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Image limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 738.871375] env[59620]: DEBUG nova.virt.hardware [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Flavor pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 738.871621] env[59620]: DEBUG nova.virt.hardware [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Image pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 738.871852] env[59620]: DEBUG nova.virt.hardware [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Chose sockets=0, cores=0, 
threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 738.872160] env[59620]: DEBUG nova.virt.hardware [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 738.872420] env[59620]: DEBUG nova.virt.hardware [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 738.872671] env[59620]: DEBUG nova.virt.hardware [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Got 1 possible topologies {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 738.873170] env[59620]: DEBUG nova.virt.hardware [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 738.873170] env[59620]: DEBUG nova.virt.hardware [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 738.874473] env[59620]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e6d662c-fc62-42ca-bbe3-bbded05fa756 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 738.883110] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca870972-acfb-45cc-a1ac-bb867ec938c7 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 738.975818] env[59620]: DEBUG oslo_service.periodic_task [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59620) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 739.002918] env[59620]: DEBUG nova.policy [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b8c7ae6f85f24cc38bd0dfb4e56f713c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '51ad16571493443e908ec396deddcdfb', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59620) authorize /opt/stack/nova/nova/policy.py:203}} [ 739.142019] env[59620]: DEBUG nova.network.neutron [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 739.153037] env[59620]: DEBUG nova.network.neutron [None req-0667b0c4-3702-4402-9355-e46427832c0e 
tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 739.157215] env[59620]: DEBUG oslo_concurrency.lockutils [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Releasing lock "refresh_cache-904eb823-1bb4-48b1-8460-c722cbc4652c" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 739.157811] env[59620]: DEBUG nova.compute.manager [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] Start destroying the instance on the hypervisor. {{(pid=59620) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 739.158285] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] Destroying instance {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 739.158909] env[59620]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-1611403e-b698-4e06-9e09-beef4cf62190 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 739.169491] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8613cc0b-d84c-4e0d-af85-ad96114ebbde {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 739.192874] env[59620]: WARNING nova.virt.vmwareapi.vmops [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 
tempest-ImagesTestJSON-1135444459-project-member] [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 904eb823-1bb4-48b1-8460-c722cbc4652c could not be found. [ 739.193160] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] Instance destroyed {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 739.193382] env[59620]: INFO nova.compute.manager [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] Took 0.04 seconds to destroy the instance on the hypervisor. [ 739.193734] env[59620]: DEBUG oslo.service.loopingcall [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59620) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 739.193917] env[59620]: DEBUG nova.compute.manager [-] [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 739.194047] env[59620]: DEBUG nova.network.neutron [-] [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 739.316360] env[59620]: DEBUG nova.network.neutron [-] [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 739.332196] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Releasing lock "[datastore1] devstack-image-cache_base/2efa4364-ba59-4de9-978f-169a769ee710" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 739.332519] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] Processing image 2efa4364-ba59-4de9-978f-169a769ee710 {{(pid=59620) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 739.332786] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/2efa4364-ba59-4de9-978f-169a769ee710/2efa4364-ba59-4de9-978f-169a769ee710.vmdk" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 739.332967] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Acquired lock "[datastore1] devstack-image-cache_base/2efa4364-ba59-4de9-978f-169a769ee710/2efa4364-ba59-4de9-978f-169a769ee710.vmdk" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 739.333192] env[59620]: DEBUG nova.virt.vmwareapi.ds_util [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Creating directory with path [datastore1] 
devstack-image-cache_base {{(pid=59620) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 739.333585] env[59620]: DEBUG nova.network.neutron [-] [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 739.338092] env[59620]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-068667b4-37a1-48ba-bfa4-7a4081fc0b8d {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 739.342745] env[59620]: INFO nova.compute.manager [-] [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] Took 0.15 seconds to deallocate network for instance. [ 739.346916] env[59620]: DEBUG nova.compute.claims [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] Aborting claim: {{(pid=59620) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 739.347135] env[59620]: DEBUG oslo_concurrency.lockutils [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 739.347406] env[59620]: DEBUG oslo_concurrency.lockutils [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 739.354723] env[59620]: DEBUG nova.virt.vmwareapi.ds_util [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 
tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59620) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 739.354913] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59620) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 739.355875] env[59620]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7394d730-994d-4640-b01c-6fb2f7478d4e {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 739.361582] env[59620]: DEBUG oslo_vmware.api [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Waiting for the task: (returnval){ [ 739.361582] env[59620]: value = "session[52dd88a5-b9b5-fd00-4be4-1e40bbf1fd5a]52b1ede8-9209-afdd-6d12-f3436a38d9b1" [ 739.361582] env[59620]: _type = "Task" [ 739.361582] env[59620]: } to complete. {{(pid=59620) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 739.370239] env[59620]: DEBUG oslo_vmware.api [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Task: {'id': session[52dd88a5-b9b5-fd00-4be4-1e40bbf1fd5a]52b1ede8-9209-afdd-6d12-f3436a38d9b1, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59620) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 739.547081] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc0cb81a-8c29-4c9b-add4-ffe14016b2e7 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 739.563527] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-491e29a2-7905-4399-93f0-c44ee58a9a52 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 739.602168] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-417ab560-2607-4abc-adad-5fc9598759e1 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 739.613409] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-88a1184a-b0ff-4e34-b44f-b3773fbd5cd1 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 739.631233] env[59620]: DEBUG nova.compute.provider_tree [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 739.645737] env[59620]: DEBUG nova.scheduler.client.report [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 
'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 739.668975] env[59620]: DEBUG oslo_concurrency.lockutils [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.321s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 739.670728] env[59620]: ERROR nova.compute.manager [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 17131708-5879-4378-bd78-cf76a598e578, please check neutron logs for more information. 
[ 739.670728] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] Traceback (most recent call last):
[ 739.670728] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 739.670728] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]     self.driver.spawn(context, instance, image_meta,
[ 739.670728] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 739.670728] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 739.670728] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 739.670728] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]     vm_ref = self.build_virtual_machine(instance,
[ 739.670728] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 739.670728] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]     vif_infos = vmwarevif.get_vif_info(self._session,
[ 739.670728] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]   File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 739.671047] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]     for vif in network_info:
[ 739.671047] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]   File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 739.671047] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]     return self._sync_wrapper(fn, *args, **kwargs)
[ 739.671047] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]   File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 739.671047] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]     self.wait()
[ 739.671047] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]   File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 739.671047] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]     self[:] = self._gt.wait()
[ 739.671047] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 739.671047] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]     return self._exit_event.wait()
[ 739.671047] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]   File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 739.671047] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]     result = hub.switch()
[ 739.671047] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 739.671047] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]     return self.greenlet.switch()
[ 739.671047] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 739.671581] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]     result = function(*args, **kwargs)
[ 739.671581] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 739.671581] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]     return func(*args, **kwargs)
[ 739.671581] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]   File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 739.671581] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]     raise e
[ 739.671581] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 739.671581] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]     nwinfo = self.network_api.allocate_for_instance(
[ 739.671581] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 739.671581] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]     created_port_ids = self._update_ports_for_instance(
[ 739.671581] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 739.671581] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]     with excutils.save_and_reraise_exception():
[ 739.671581] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 739.671581] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]     self.force_reraise()
[ 739.672017] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 739.672017] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]     raise self.value
[ 739.672017] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 739.672017] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]     updated_port = self._update_port(
[ 739.672017] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 739.672017] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]     _ensure_no_port_binding_failure(port)
[ 739.672017] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 739.672017] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]     raise exception.PortBindingFailed(port_id=port['id'])
[ 739.672017] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] nova.exception.PortBindingFailed: Binding failed for port 17131708-5879-4378-bd78-cf76a598e578, please check neutron logs for more information.
[ 739.672017] env[59620]: ERROR nova.compute.manager [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c]
[ 739.672017] env[59620]: DEBUG nova.compute.utils [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] Binding failed for port 17131708-5879-4378-bd78-cf76a598e578, please check neutron logs for more information.
{{(pid=59620) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 739.674316] env[59620]: DEBUG nova.compute.manager [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] Build of instance 904eb823-1bb4-48b1-8460-c722cbc4652c was re-scheduled: Binding failed for port 17131708-5879-4378-bd78-cf76a598e578, please check neutron logs for more information. {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 739.674800] env[59620]: DEBUG nova.compute.manager [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] Unplugging VIFs for instance {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 739.675109] env[59620]: DEBUG oslo_concurrency.lockutils [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Acquiring lock "refresh_cache-904eb823-1bb4-48b1-8460-c722cbc4652c" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 739.675176] env[59620]: DEBUG oslo_concurrency.lockutils [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Acquired lock "refresh_cache-904eb823-1bb4-48b1-8460-c722cbc4652c" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 739.675334] env[59620]: DEBUG nova.network.neutron [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 739.717639] env[59620]: DEBUG oslo_concurrency.lockutils [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Acquiring lock "fea7d2f4-199d-4c76-84cd-4ee7820990ec" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 739.717827] env[59620]: DEBUG oslo_concurrency.lockutils [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "fea7d2f4-199d-4c76-84cd-4ee7820990ec" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 739.729074] env[59620]: DEBUG nova.compute.manager [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] Starting instance... {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 739.802331] env[59620]: DEBUG oslo_concurrency.lockutils [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 739.802616] env[59620]: DEBUG oslo_concurrency.lockutils [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 739.805103] env[59620]: INFO nova.compute.claims [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 739.815213] env[59620]: DEBUG nova.network.neutron [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 739.870741] env[59620]: DEBUG nova.network.neutron [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 739.885335] env[59620]: DEBUG oslo_concurrency.lockutils [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Releasing lock "refresh_cache-8e7db1a1-c3b2-44a9-a33b-d3544ef57f77" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 739.885791] env[59620]: DEBUG nova.compute.manager [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] Start destroying the instance on the hypervisor.
{{(pid=59620) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 739.885971] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] Destroying instance {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 739.886572] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] Preparing fetch location {{(pid=59620) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 739.886572] env[59620]: DEBUG nova.virt.vmwareapi.ds_util [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Creating directory with path [datastore1] vmware_temp/29544aff-9dd0-4af0-800c-bebbc10731fe/2efa4364-ba59-4de9-978f-169a769ee710 {{(pid=59620) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 739.886935] env[59620]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-b2c54199-5bbb-4aee-8517-2f0f13dacbaf {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 739.889387] env[59620]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7fb25446-7665-46a5-918f-50cdf39e86d6 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 739.900218] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0efc94db-010a-49a3-ac70-8eb8244ecdd8 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 739.921269] env[59620]: DEBUG nova.virt.vmwareapi.ds_util [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Created directory with path [datastore1] vmware_temp/29544aff-9dd0-4af0-800c-bebbc10731fe/2efa4364-ba59-4de9-978f-169a769ee710 {{(pid=59620) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 739.921456] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] Fetch image to [datastore1] vmware_temp/29544aff-9dd0-4af0-800c-bebbc10731fe/2efa4364-ba59-4de9-978f-169a769ee710/tmp-sparse.vmdk {{(pid=59620) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 739.921669] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] Downloading image file data 2efa4364-ba59-4de9-978f-169a769ee710 to [datastore1] vmware_temp/29544aff-9dd0-4af0-800c-bebbc10731fe/2efa4364-ba59-4de9-978f-169a769ee710/tmp-sparse.vmdk on the data store datastore1 {{(pid=59620) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 739.923055] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-773b8139-1fe4-485f-b999-7e2ba69eb29f {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 739.938768] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-def4858c-b5fd-44e8-9026-b2284cae6add {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 739.944563] env[59620]: WARNING nova.virt.vmwareapi.vmops [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77 could not be found.
[ 739.944799] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] Instance destroyed {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 739.944975] env[59620]: INFO nova.compute.manager [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] Took 0.06 seconds to destroy the instance on the hypervisor.
[ 739.945231] env[59620]: DEBUG oslo.service.loopingcall [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=59620) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 739.948477] env[59620]: DEBUG nova.compute.manager [-] [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 739.948616] env[59620]: DEBUG nova.network.neutron [-] [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 739.958963] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83cf726f-b68e-4837-919d-29191e055a84 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 739.965646] env[59620]: DEBUG oslo_service.periodic_task [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59620) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 739.966093] env[59620]: DEBUG oslo_service.periodic_task [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59620) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 739.966630] env[59620]: DEBUG oslo_service.periodic_task [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59620) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 739.966793] env[59620]: DEBUG oslo_service.periodic_task [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59620) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 739.967985] env[59620]: DEBUG nova.compute.manager [None
req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59620) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}}
[ 740.006841] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d5842c7-d055-4c4f-846b-b53b6840190d {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 740.013266] env[59620]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-c980e1ea-1d52-46f6-9ef2-3b6a93d297f8 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 740.063250] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca6b6468-a951-45bd-a396-e96352989fe6 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 740.071645] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cca45062-707b-48af-afd3-0ab6de60d83e {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 740.105901] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-37c21345-a431-43ca-8873-b807d9fbf313 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 740.108761] env[59620]: DEBUG nova.virt.vmwareapi.images [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] Downloading image file data 2efa4364-ba59-4de9-978f-169a769ee710 to the data store datastore1 {{(pid=59620) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 740.115410] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94d4b2f0-a134-4437-b0d9-a8461d7bce14 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 740.128984] env[59620]: DEBUG nova.compute.provider_tree [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 740.140903] env[59620]: DEBUG nova.scheduler.client.report [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 740.168458] env[59620]: DEBUG oslo_concurrency.lockutils [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.366s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 740.168910] env[59620]: DEBUG nova.compute.manager [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] Start building networks asynchronously for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 740.173082] env[59620]: DEBUG oslo_vmware.rw_handles [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/29544aff-9dd0-4af0-800c-bebbc10731fe/2efa4364-ba59-4de9-978f-169a769ee710/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59620) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}}
[ 740.229193] env[59620]: DEBUG nova.compute.utils [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Using /dev/sd instead of None {{(pid=59620) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 740.234216] env[59620]: DEBUG nova.compute.manager [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] Allocating IP information in the background. {{(pid=59620) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 740.234216] env[59620]: DEBUG nova.network.neutron [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] allocate_for_instance() {{(pid=59620) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 740.234561] env[59620]: DEBUG oslo_vmware.rw_handles [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Completed reading data from the image iterator. {{(pid=59620) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}}
[ 740.234700] env[59620]: DEBUG oslo_vmware.rw_handles [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/29544aff-9dd0-4af0-800c-bebbc10731fe/2efa4364-ba59-4de9-978f-169a769ee710/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59620) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}}
[ 740.241291] env[59620]: DEBUG nova.compute.manager [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] Start building block device mappings for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 740.269665] env[59620]: DEBUG nova.network.neutron [-] [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 740.283019] env[59620]: DEBUG nova.network.neutron [-] [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 740.298641] env[59620]: INFO nova.compute.manager [-] [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] Took 0.35 seconds to deallocate network for instance.
[ 740.306436] env[59620]: DEBUG nova.compute.claims [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] Aborting claim: {{(pid=59620) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 740.306436] env[59620]: DEBUG oslo_concurrency.lockutils [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 740.306436] env[59620]: DEBUG oslo_concurrency.lockutils [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 740.335950] env[59620]: DEBUG nova.compute.manager [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] Start spawning the instance on the hypervisor. {{(pid=59620) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 740.364633] env[59620]: DEBUG nova.virt.hardware [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T20:10:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T20:10:40Z,direct_url=,disk_format='vmdk',id=2efa4364-ba59-4de9-978f-169a769ee710,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f77c77bd878b4e17895863a4a7a8191f',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T20:10:41Z,virtual_size=,visibility=), allow threads: False {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 740.364851] env[59620]: DEBUG nova.virt.hardware [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Flavor limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 740.364997] env[59620]: DEBUG nova.virt.hardware [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Image limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 740.369327] env[59620]: DEBUG nova.virt.hardware [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Flavor pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 740.369484] env[59620]: DEBUG nova.virt.hardware [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Image pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 740.369628] env[59620]: DEBUG nova.virt.hardware [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 740.369832] env[59620]: DEBUG nova.virt.hardware [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 740.369987] env[59620]: DEBUG nova.virt.hardware [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 740.370162] env[59620]: DEBUG nova.virt.hardware [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Got 1 possible topologies {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 740.370314] env[59620]: DEBUG nova.virt.hardware [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Possible topologies
[VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 740.370507] env[59620]: DEBUG nova.virt.hardware [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 740.371880] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84bd8076-ccc2-4e4c-8162-09b82449d8e8 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 740.384284] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-55384c3c-7d49-471e-ae2d-4cebf9696338 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 740.478845] env[59620]: DEBUG nova.network.neutron [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 740.484129] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-22fb5813-e6a4-45b0-8b35-f926088ab25b {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 740.491742] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0cad4585-a05a-48e2-8889-20e0a407ad49 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 740.497136] env[59620]: DEBUG oslo_concurrency.lockutils [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Releasing lock "refresh_cache-904eb823-1bb4-48b1-8460-c722cbc4652c" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 740.497448] env[59620]: DEBUG nova.compute.manager [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 740.497549] env[59620]: DEBUG nova.compute.manager [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 740.497689] env[59620]: DEBUG nova.network.neutron [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 740.528345] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13e862a0-687d-42ae-afbd-c355c4147454 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 740.536458] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1013c289-d20a-43ef-a7f7-afe100225e0c {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 740.551870] env[59620]: DEBUG nova.compute.provider_tree [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 740.567212] env[59620]: DEBUG nova.scheduler.client.report [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 740.592939] env[59620]: DEBUG oslo_concurrency.lockutils [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.287s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 740.593619] env[59620]: ERROR nova.compute.manager [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port d7751aee-44fe-4f97-94a8-f9c9b96e7bfe, please check neutron logs for more information.
[ 740.593619] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] Traceback (most recent call last): [ 740.593619] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 740.593619] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] self.driver.spawn(context, instance, image_meta, [ 740.593619] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 740.593619] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] self._vmops.spawn(context, instance, image_meta, injected_files, [ 740.593619] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 740.593619] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] vm_ref = self.build_virtual_machine(instance, [ 740.593619] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 740.593619] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] vif_infos = vmwarevif.get_vif_info(self._session, [ 740.593619] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 740.593955] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] for vif in network_info: [ 740.593955] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 740.593955] env[59620]: ERROR 
nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] return self._sync_wrapper(fn, *args, **kwargs) [ 740.593955] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 740.593955] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] self.wait() [ 740.593955] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 740.593955] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] self[:] = self._gt.wait() [ 740.593955] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 740.593955] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] return self._exit_event.wait() [ 740.593955] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 740.593955] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] result = hub.switch() [ 740.593955] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 740.593955] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] return self.greenlet.switch() [ 740.593955] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 740.594330] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] result = function(*args, **kwargs) [ 
740.594330] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 740.594330] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] return func(*args, **kwargs) [ 740.594330] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 740.594330] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] raise e [ 740.594330] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 740.594330] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] nwinfo = self.network_api.allocate_for_instance( [ 740.594330] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 740.594330] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] created_port_ids = self._update_ports_for_instance( [ 740.594330] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 740.594330] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] with excutils.save_and_reraise_exception(): [ 740.594330] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 740.594330] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] self.force_reraise() [ 740.594724] env[59620]: ERROR nova.compute.manager [instance: 
8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 740.594724] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] raise self.value [ 740.594724] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 740.594724] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] updated_port = self._update_port( [ 740.594724] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 740.594724] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] _ensure_no_port_binding_failure(port) [ 740.594724] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 740.594724] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] raise exception.PortBindingFailed(port_id=port['id']) [ 740.594724] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] nova.exception.PortBindingFailed: Binding failed for port d7751aee-44fe-4f97-94a8-f9c9b96e7bfe, please check neutron logs for more information. [ 740.594724] env[59620]: ERROR nova.compute.manager [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] [ 740.594724] env[59620]: DEBUG nova.compute.utils [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] Binding failed for port d7751aee-44fe-4f97-94a8-f9c9b96e7bfe, please check neutron logs for more information. 
{{(pid=59620) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 740.597187] env[59620]: DEBUG nova.compute.manager [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] Build of instance 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77 was re-scheduled: Binding failed for port d7751aee-44fe-4f97-94a8-f9c9b96e7bfe, please check neutron logs for more information. {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 740.597187] env[59620]: DEBUG nova.compute.manager [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] Unplugging VIFs for instance {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 740.597187] env[59620]: DEBUG oslo_concurrency.lockutils [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Acquiring lock "refresh_cache-8e7db1a1-c3b2-44a9-a33b-d3544ef57f77" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 740.597187] env[59620]: DEBUG oslo_concurrency.lockutils [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Acquired lock "refresh_cache-8e7db1a1-c3b2-44a9-a33b-d3544ef57f77" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 740.597542] env[59620]: DEBUG nova.network.neutron [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] Building network 
info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 740.607200] env[59620]: DEBUG nova.network.neutron [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 740.616093] env[59620]: DEBUG nova.policy [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6316c8b7da8d4d3c97b2693b33729c52', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cc4c8738af2b48f981e5f2feadb41a59', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59620) authorize /opt/stack/nova/nova/policy.py:203}} [ 740.617761] env[59620]: DEBUG nova.network.neutron [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 740.634966] env[59620]: INFO nova.compute.manager [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] [instance: 904eb823-1bb4-48b1-8460-c722cbc4652c] Took 0.14 seconds to deallocate network for instance. 
[ 740.660453] env[59620]: DEBUG nova.network.neutron [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 740.676362] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Acquiring lock "671639d6-3103-4eeb-86d3-b858a3919396" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 740.676614] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Lock "671639d6-3103-4eeb-86d3-b858a3919396" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.004s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 740.699575] env[59620]: DEBUG nova.compute.manager [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] [instance: 671639d6-3103-4eeb-86d3-b858a3919396] Starting instance... 
{{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 740.734565] env[59620]: INFO nova.scheduler.client.report [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Deleted allocations for instance 904eb823-1bb4-48b1-8460-c722cbc4652c [ 740.770072] env[59620]: DEBUG oslo_concurrency.lockutils [None req-5d45c3e2-ee5f-4b14-adf4-6717cd14f410 tempest-ImagesTestJSON-1135444459 tempest-ImagesTestJSON-1135444459-project-member] Lock "904eb823-1bb4-48b1-8460-c722cbc4652c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 17.530s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 740.782402] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 740.782680] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 740.787336] env[59620]: INFO nova.compute.claims [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] [instance: 671639d6-3103-4eeb-86d3-b858a3919396] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 740.961649] env[59620]: DEBUG 
oslo_service.periodic_task [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59620) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 741.035098] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef1c60a7-7456-48b2-bb9c-c1bfe89e6392 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 741.043569] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c68b3134-8b57-413f-bd48-6c85192bf96d {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 741.077563] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38151436-3c2f-45d9-8030-6cc76944e9c6 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 741.085615] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6342d65e-b850-43db-a673-07043c9a91b7 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 741.102796] env[59620]: DEBUG nova.compute.provider_tree [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 741.110684] env[59620]: DEBUG nova.scheduler.client.report [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: 
{'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 741.117288] env[59620]: DEBUG nova.network.neutron [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 741.129037] env[59620]: DEBUG oslo_concurrency.lockutils [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Releasing lock "refresh_cache-8e7db1a1-c3b2-44a9-a33b-d3544ef57f77" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 741.129265] env[59620]: DEBUG nova.compute.manager [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 741.129440] env[59620]: DEBUG nova.compute.manager [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 741.129594] env[59620]: DEBUG nova.network.neutron [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 741.133267] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.351s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 741.133859] env[59620]: DEBUG nova.compute.manager [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] [instance: 671639d6-3103-4eeb-86d3-b858a3919396] Start building networks asynchronously for instance. 
{{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 741.175365] env[59620]: DEBUG nova.compute.utils [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Using /dev/sd instead of None {{(pid=59620) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 741.175365] env[59620]: DEBUG nova.compute.manager [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] [instance: 671639d6-3103-4eeb-86d3-b858a3919396] Allocating IP information in the background. {{(pid=59620) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 741.175536] env[59620]: DEBUG nova.network.neutron [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] [instance: 671639d6-3103-4eeb-86d3-b858a3919396] allocate_for_instance() {{(pid=59620) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 741.196118] env[59620]: DEBUG nova.compute.manager [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] [instance: 671639d6-3103-4eeb-86d3-b858a3919396] Start building block device mappings for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 741.228576] env[59620]: DEBUG nova.network.neutron [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 741.240468] env[59620]: DEBUG nova.network.neutron [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 741.254456] env[59620]: INFO nova.compute.manager [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] [instance: 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77] Took 0.12 seconds to deallocate network for instance. [ 741.292911] env[59620]: DEBUG nova.compute.manager [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] [instance: 671639d6-3103-4eeb-86d3-b858a3919396] Start spawning the instance on the hypervisor. 
{{(pid=59620) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 741.320119] env[59620]: DEBUG nova.virt.hardware [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T20:10:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T20:10:40Z,direct_url=,disk_format='vmdk',id=2efa4364-ba59-4de9-978f-169a769ee710,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f77c77bd878b4e17895863a4a7a8191f',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T20:10:41Z,virtual_size=,visibility=), allow threads: False {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 741.320357] env[59620]: DEBUG nova.virt.hardware [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Flavor limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 741.320618] env[59620]: DEBUG nova.virt.hardware [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Image limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 741.320836] env[59620]: DEBUG nova.virt.hardware [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Flavor pref 
0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 741.320986] env[59620]: DEBUG nova.virt.hardware [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Image pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 741.321144] env[59620]: DEBUG nova.virt.hardware [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 741.321344] env[59620]: DEBUG nova.virt.hardware [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 741.321501] env[59620]: DEBUG nova.virt.hardware [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 741.321661] env[59620]: DEBUG nova.virt.hardware [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Got 1 possible topologies {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 741.321817] env[59620]: DEBUG nova.virt.hardware [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 
tempest-ServerAddressesTestJSON-1955780032-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 741.322727] env[59620]: DEBUG nova.virt.hardware [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 741.323088] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c41f501-552b-4329-afe7-0ce03084bd2b {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 741.334824] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8eabdb31-c14e-4c87-9882-6e94c99c50fe {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 741.385232] env[59620]: INFO nova.scheduler.client.report [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Deleted allocations for instance 8e7db1a1-c3b2-44a9-a33b-d3544ef57f77 [ 741.413904] env[59620]: DEBUG oslo_concurrency.lockutils [None req-0667b0c4-3702-4402-9355-e46427832c0e tempest-AttachInterfacesTestJSON-1546053889 tempest-AttachInterfacesTestJSON-1546053889-project-member] Lock "8e7db1a1-c3b2-44a9-a33b-d3544ef57f77" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 14.908s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 741.491915] env[59620]: DEBUG nova.network.neutron [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 
tempest-AttachInterfacesV270Test-2033217154-project-member] [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] Successfully created port: 662701b8-6fd9-446f-92d4-22ec5381326d {{(pid=59620) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 741.716353] env[59620]: DEBUG nova.policy [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2e6de82121c64c36a180db6eb2734ee2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '03e26d763fe749bebf9aab124185bffe', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59620) authorize /opt/stack/nova/nova/policy.py:203}} [ 741.959653] env[59620]: DEBUG oslo_service.periodic_task [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59620) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 741.959890] env[59620]: DEBUG oslo_service.periodic_task [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Running periodic task ComputeManager.update_available_resource {{(pid=59620) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 741.970594] env[59620]: DEBUG oslo_concurrency.lockutils [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 741.970895] env[59620]: DEBUG oslo_concurrency.lockutils [None 
req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 741.970950] env[59620]: DEBUG oslo_concurrency.lockutils [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 741.971113] env[59620]: DEBUG nova.compute.resource_tracker [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59620) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 741.975738] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11eff16f-b180-4789-b0b6-d7c0e933d954 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 741.988019] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7acebfbb-8a39-4d02-9c29-dd6085ecd9d3 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 742.002772] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7ba8f94-a914-45ac-8478-457255ed2ca0 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 742.010645] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-452d5f63-b94d-49d5-b30d-078e4b5bee29 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 
742.044988] env[59620]: DEBUG nova.compute.resource_tracker [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181465MB free_disk=136GB free_vcpus=48 pci_devices=None {{(pid=59620) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 742.045238] env[59620]: DEBUG oslo_concurrency.lockutils [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 742.045434] env[59620]: DEBUG oslo_concurrency.lockutils [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 742.104355] env[59620]: DEBUG nova.compute.resource_tracker [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Instance 6da1d5a5-ff2a-478d-97b8-bf237f844bae actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59620) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 742.104355] env[59620]: DEBUG nova.compute.resource_tracker [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Instance 7b5558e4-05fc-4755-accf-77228272884f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59620) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 742.104355] env[59620]: DEBUG nova.compute.resource_tracker [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Instance 50bbfdd5-bac5-4634-bc5d-c215a31889e9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59620) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 742.104355] env[59620]: DEBUG nova.compute.resource_tracker [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Instance d85db5e9-ce70-477d-bb5c-7665ab69b19a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59620) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 742.104533] env[59620]: DEBUG nova.compute.resource_tracker [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Instance fea7d2f4-199d-4c76-84cd-4ee7820990ec actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59620) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 742.104533] env[59620]: DEBUG nova.compute.resource_tracker [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Instance 671639d6-3103-4eeb-86d3-b858a3919396 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59620) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 742.104533] env[59620]: DEBUG nova.compute.resource_tracker [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Total usable vcpus: 48, total allocated vcpus: 6 {{(pid=59620) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 742.104667] env[59620]: DEBUG nova.compute.resource_tracker [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1280MB phys_disk=200GB used_disk=6GB total_vcpus=48 used_vcpus=6 pci_stats=[] {{(pid=59620) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 742.216029] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11780bb3-baab-4496-84fe-eab8db59c575 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 742.222696] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da828f1b-ede4-4b1c-ac3f-2884ce635d7d {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 742.257503] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f0f7c7ab-20d9-422f-968c-b180736c27e9 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 742.270008] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3e657e5-39dc-4ede-9619-fcf05a466e35 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 742.284854] env[59620]: DEBUG nova.compute.provider_tree [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Inventory has not changed in ProviderTree for provider: 
40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 742.293562] env[59620]: DEBUG nova.scheduler.client.report [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 742.316302] env[59620]: DEBUG nova.compute.resource_tracker [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59620) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 742.316485] env[59620]: DEBUG oslo_concurrency.lockutils [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.271s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 742.924191] env[59620]: DEBUG nova.network.neutron [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] Successfully created port: ba41efcb-e78a-4f3e-a135-67f3325e12a4 {{(pid=59620) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 743.621169] env[59620]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 
tempest-ListServersNegativeTestJSON-667203106-project-member] Acquiring lock "f61f8046-f2ee-4de3-9c45-de52c2849399" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 743.621483] env[59620]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Lock "f61f8046-f2ee-4de3-9c45-de52c2849399" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 743.643426] env[59620]: DEBUG nova.compute.manager [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] Starting instance... 
{{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 743.691158] env[59620]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Acquiring lock "7c8fc3b5-2d31-498a-ac7c-a8bffe415d70" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 743.691459] env[59620]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Lock "7c8fc3b5-2d31-498a-ac7c-a8bffe415d70" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 743.707754] env[59620]: DEBUG nova.compute.manager [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] Starting instance... 
{{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 743.721277] env[59620]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 743.722028] env[59620]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 743.723555] env[59620]: INFO nova.compute.claims [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 743.742033] env[59620]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Acquiring lock "5c21177e-6cff-414f-bff1-bac166929cab" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 743.742033] env[59620]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Lock "5c21177e-6cff-414f-bff1-bac166929cab" acquired by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 743.761189] env[59620]: DEBUG nova.compute.manager [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 5c21177e-6cff-414f-bff1-bac166929cab] Starting instance... {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 743.785345] env[59620]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 743.829136] env[59620]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 743.927959] env[59620]: DEBUG nova.network.neutron [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] [instance: 671639d6-3103-4eeb-86d3-b858a3919396] Successfully created port: a4475db3-1cfd-47df-9fd1-b9a0d4fa35a1 {{(pid=59620) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 743.945857] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4326d445-17dd-4109-945d-a2f34148b3f0 {{(pid=59620) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 743.957657] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a247076-15e3-4009-88d7-2c40fed6ac3f {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 743.994571] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1848c5c-4809-4127-8574-12d826b1462e {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 744.003442] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b700a99f-8e6c-444d-9d1a-1d1809114bf6 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 744.018202] env[59620]: DEBUG nova.compute.provider_tree [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 744.029304] env[59620]: DEBUG nova.scheduler.client.report [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider 
/opt/stack/nova/nova/scheduler/client/report.py:940}} [ 744.043458] env[59620]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.322s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 744.043982] env[59620]: DEBUG nova.compute.manager [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] Start building networks asynchronously for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 744.046384] env[59620]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.261s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 744.047845] env[59620]: INFO nova.compute.claims [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 744.090907] env[59620]: DEBUG nova.compute.utils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Using /dev/sd instead of None {{(pid=59620) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 744.091287] env[59620]: DEBUG nova.compute.manager [None 
req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] Allocating IP information in the background. {{(pid=59620) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 744.091389] env[59620]: DEBUG nova.network.neutron [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] allocate_for_instance() {{(pid=59620) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 744.108085] env[59620]: DEBUG nova.compute.manager [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] Start building block device mappings for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 744.190749] env[59620]: DEBUG nova.compute.manager [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] Start spawning the instance on the hypervisor. 
{{(pid=59620) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 744.220609] env[59620]: DEBUG nova.virt.hardware [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T20:10:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T20:10:40Z,direct_url=,disk_format='vmdk',id=2efa4364-ba59-4de9-978f-169a769ee710,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f77c77bd878b4e17895863a4a7a8191f',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T20:10:41Z,virtual_size=,visibility=), allow threads: False {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 744.220856] env[59620]: DEBUG nova.virt.hardware [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Flavor limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 744.221014] env[59620]: DEBUG nova.virt.hardware [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Image limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 744.221192] env[59620]: DEBUG nova.virt.hardware [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 
tempest-ListServersNegativeTestJSON-667203106-project-member] Flavor pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 744.221330] env[59620]: DEBUG nova.virt.hardware [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Image pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 744.221470] env[59620]: DEBUG nova.virt.hardware [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 744.221670] env[59620]: DEBUG nova.virt.hardware [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 744.221825] env[59620]: DEBUG nova.virt.hardware [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 744.221986] env[59620]: DEBUG nova.virt.hardware [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Got 1 possible topologies {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 744.222301] env[59620]: DEBUG nova.virt.hardware [None 
req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 744.222482] env[59620]: DEBUG nova.virt.hardware [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 744.223788] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c1c84702-0884-4f7c-a952-8cefdf0acd63 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 744.235960] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3bbdb500-f125-4af7-9510-cbc818898fe0 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 744.254166] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de6aeb11-e3c6-4424-bde6-4b327617c29e {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 744.261452] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f4e03c9-80d7-4331-af66-e3f73f69d30a {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 744.292375] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-041d012e-5901-4aa3-8747-c104ffac366b {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 744.300212] 
env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3be9574-b877-427c-8b3e-f1a342c2ded0 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 744.315472] env[59620]: DEBUG nova.compute.provider_tree [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 744.324592] env[59620]: DEBUG nova.scheduler.client.report [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 744.340615] env[59620]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.294s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 744.342107] env[59620]: DEBUG nova.compute.manager [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 
tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] Start building networks asynchronously for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 744.343543] env[59620]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.515s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 744.345046] env[59620]: INFO nova.compute.claims [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 5c21177e-6cff-414f-bff1-bac166929cab] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 744.377570] env[59620]: DEBUG nova.compute.utils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Using /dev/sd instead of None {{(pid=59620) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 744.379414] env[59620]: DEBUG nova.compute.manager [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] Allocating IP information in the background. 
{{(pid=59620) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 744.379606] env[59620]: DEBUG nova.network.neutron [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] allocate_for_instance() {{(pid=59620) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 744.394484] env[59620]: DEBUG nova.compute.manager [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] Start building block device mappings for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 744.420025] env[59620]: DEBUG nova.policy [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6e7b7c11c6f14d82aae4d866a76aaa73', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c4efa8e2dcbc492ea32aa20745286b46', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59620) authorize /opt/stack/nova/nova/policy.py:203}} [ 744.480744] env[59620]: DEBUG nova.compute.manager [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] Start spawning the instance on the hypervisor. 
{{(pid=59620) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 744.511808] env[59620]: DEBUG nova.virt.hardware [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T20:10:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T20:10:40Z,direct_url=,disk_format='vmdk',id=2efa4364-ba59-4de9-978f-169a769ee710,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f77c77bd878b4e17895863a4a7a8191f',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T20:10:41Z,virtual_size=,visibility=), allow threads: False {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 744.511808] env[59620]: DEBUG nova.virt.hardware [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Flavor limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 744.511808] env[59620]: DEBUG nova.virt.hardware [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Image limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 744.512154] env[59620]: DEBUG nova.virt.hardware [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 
tempest-ListServersNegativeTestJSON-667203106-project-member] Flavor pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 744.512154] env[59620]: DEBUG nova.virt.hardware [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Image pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 744.512154] env[59620]: DEBUG nova.virt.hardware [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 744.512154] env[59620]: DEBUG nova.virt.hardware [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 744.512154] env[59620]: DEBUG nova.virt.hardware [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 744.512329] env[59620]: DEBUG nova.virt.hardware [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Got 1 possible topologies {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 744.512577] env[59620]: DEBUG nova.virt.hardware [None 
req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 744.512851] env[59620]: DEBUG nova.virt.hardware [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 744.514545] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c99bec72-aa1e-4077-9b36-df04d3e50502 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 744.531936] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ad03460-edab-43b6-b423-6f20cda8f9cc {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 744.594864] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a2fd8dc-e9fa-472f-910d-5bab2ed94c3f {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 744.602962] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8902bb2e-d159-4b66-9d2d-3c02fad65bb8 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 744.640096] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d26d4ccd-d006-4126-81cb-ba1c2839a716 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 744.649178] 
env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f56850ac-c4aa-4014-a412-92ca5d4c2612 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 744.661957] env[59620]: DEBUG nova.compute.provider_tree [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 744.674063] env[59620]: DEBUG nova.scheduler.client.report [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 744.695994] env[59620]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.352s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 744.696550] env[59620]: DEBUG nova.compute.manager [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 
tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 5c21177e-6cff-414f-bff1-bac166929cab] Start building networks asynchronously for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 744.744519] env[59620]: DEBUG nova.compute.utils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Using /dev/sd instead of None {{(pid=59620) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 744.746018] env[59620]: DEBUG nova.compute.manager [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 5c21177e-6cff-414f-bff1-bac166929cab] Allocating IP information in the background. {{(pid=59620) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 744.746018] env[59620]: DEBUG nova.network.neutron [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 5c21177e-6cff-414f-bff1-bac166929cab] allocate_for_instance() {{(pid=59620) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 744.762332] env[59620]: DEBUG nova.compute.manager [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 5c21177e-6cff-414f-bff1-bac166929cab] Start building block device mappings for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 744.849697] env[59620]: DEBUG nova.compute.manager [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 5c21177e-6cff-414f-bff1-bac166929cab] Start spawning the instance on the hypervisor. 
{{(pid=59620) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 744.875345] env[59620]: DEBUG nova.virt.hardware [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T20:10:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T20:10:40Z,direct_url=,disk_format='vmdk',id=2efa4364-ba59-4de9-978f-169a769ee710,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f77c77bd878b4e17895863a4a7a8191f',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T20:10:41Z,virtual_size=,visibility=), allow threads: False {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 744.875452] env[59620]: DEBUG nova.virt.hardware [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Flavor limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 744.877889] env[59620]: DEBUG nova.virt.hardware [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Image limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 744.877889] env[59620]: DEBUG nova.virt.hardware [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 
tempest-ListServersNegativeTestJSON-667203106-project-member] Flavor pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 744.877889] env[59620]: DEBUG nova.virt.hardware [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Image pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 744.877889] env[59620]: DEBUG nova.virt.hardware [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 744.877889] env[59620]: DEBUG nova.virt.hardware [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 744.878044] env[59620]: DEBUG nova.virt.hardware [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 744.878044] env[59620]: DEBUG nova.virt.hardware [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Got 1 possible topologies {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 744.878044] env[59620]: DEBUG nova.virt.hardware [None 
req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 744.878044] env[59620]: DEBUG nova.virt.hardware [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 744.878044] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11ad0b83-4959-4259-a8a0-1ab8928f0edf {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 744.892339] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba317ae2-e4b0-4b5e-a86a-582baa4347b8 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 744.964125] env[59620]: DEBUG nova.policy [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6e7b7c11c6f14d82aae4d866a76aaa73', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c4efa8e2dcbc492ea32aa20745286b46', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59620) authorize /opt/stack/nova/nova/policy.py:203}} [ 744.972316] env[59620]: DEBUG 
oslo_concurrency.lockutils [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Acquiring lock "d26dfe85-1a71-48e1-b462-f26f1327a9e7" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 744.972534] env[59620]: DEBUG oslo_concurrency.lockutils [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Lock "d26dfe85-1a71-48e1-b462-f26f1327a9e7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 744.987310] env[59620]: DEBUG nova.compute.manager [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] Starting instance... 
{{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 745.031094] env[59620]: DEBUG nova.policy [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6e7b7c11c6f14d82aae4d866a76aaa73', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c4efa8e2dcbc492ea32aa20745286b46', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59620) authorize /opt/stack/nova/nova/policy.py:203}} [ 745.050045] env[59620]: DEBUG oslo_concurrency.lockutils [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 745.050315] env[59620]: DEBUG oslo_concurrency.lockutils [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 745.051886] env[59620]: INFO nova.compute.claims [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] Claim successful on node 
domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 745.098194] env[59620]: ERROR nova.compute.manager [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 448e1b14-7b08-4172-9ff9-7e7ba7e84a68, please check neutron logs for more information. [ 745.098194] env[59620]: ERROR nova.compute.manager Traceback (most recent call last): [ 745.098194] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 745.098194] env[59620]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 745.098194] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 745.098194] env[59620]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 745.098194] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 745.098194] env[59620]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 745.098194] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 745.098194] env[59620]: ERROR nova.compute.manager self.force_reraise() [ 745.098194] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 745.098194] env[59620]: ERROR nova.compute.manager raise self.value [ 745.098194] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 745.098194] env[59620]: ERROR nova.compute.manager updated_port = self._update_port( [ 745.098194] env[59620]: ERROR nova.compute.manager File 
"/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 745.098194] env[59620]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 745.098688] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 745.098688] env[59620]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 745.098688] env[59620]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 448e1b14-7b08-4172-9ff9-7e7ba7e84a68, please check neutron logs for more information. [ 745.098688] env[59620]: ERROR nova.compute.manager [ 745.098688] env[59620]: Traceback (most recent call last): [ 745.098688] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 745.098688] env[59620]: listener.cb(fileno) [ 745.098688] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 745.098688] env[59620]: result = function(*args, **kwargs) [ 745.098688] env[59620]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 745.098688] env[59620]: return func(*args, **kwargs) [ 745.098688] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 745.098688] env[59620]: raise e [ 745.098688] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 745.098688] env[59620]: nwinfo = self.network_api.allocate_for_instance( [ 745.098688] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 745.098688] env[59620]: created_port_ids = self._update_ports_for_instance( [ 745.098688] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 745.098688] env[59620]: with excutils.save_and_reraise_exception(): [ 745.098688] env[59620]: File 
"/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 745.098688] env[59620]: self.force_reraise() [ 745.098688] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 745.098688] env[59620]: raise self.value [ 745.098688] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 745.098688] env[59620]: updated_port = self._update_port( [ 745.098688] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 745.098688] env[59620]: _ensure_no_port_binding_failure(port) [ 745.098688] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 745.098688] env[59620]: raise exception.PortBindingFailed(port_id=port['id']) [ 745.099529] env[59620]: nova.exception.PortBindingFailed: Binding failed for port 448e1b14-7b08-4172-9ff9-7e7ba7e84a68, please check neutron logs for more information. [ 745.099529] env[59620]: Removing descriptor: 19 [ 745.099529] env[59620]: ERROR nova.compute.manager [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 448e1b14-7b08-4172-9ff9-7e7ba7e84a68, please check neutron logs for more information. 
[ 745.099529] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Traceback (most recent call last): [ 745.099529] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 745.099529] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] yield resources [ 745.099529] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 745.099529] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] self.driver.spawn(context, instance, image_meta, [ 745.099529] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 745.099529] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] self._vmops.spawn(context, instance, image_meta, injected_files, [ 745.099529] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 745.099529] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] vm_ref = self.build_virtual_machine(instance, [ 745.099842] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 745.099842] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] vif_infos = vmwarevif.get_vif_info(self._session, [ 745.099842] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 745.099842] env[59620]: ERROR 
nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] for vif in network_info: [ 745.099842] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 745.099842] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] return self._sync_wrapper(fn, *args, **kwargs) [ 745.099842] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 745.099842] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] self.wait() [ 745.099842] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 745.099842] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] self[:] = self._gt.wait() [ 745.099842] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 745.099842] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] return self._exit_event.wait() [ 745.099842] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 745.099842] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] result = hub.switch() [ 745.100205] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 745.100205] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] return self.greenlet.switch() [ 745.100205] env[59620]: ERROR 
nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 745.100205] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] result = function(*args, **kwargs) [ 745.100205] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 745.100205] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] return func(*args, **kwargs) [ 745.100205] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 745.100205] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] raise e [ 745.100205] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 745.100205] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] nwinfo = self.network_api.allocate_for_instance( [ 745.100205] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 745.100205] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] created_port_ids = self._update_ports_for_instance( [ 745.100205] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 745.100582] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] with excutils.save_and_reraise_exception(): [ 745.100582] env[59620]: ERROR nova.compute.manager [instance: 
6da1d5a5-ff2a-478d-97b8-bf237f844bae] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 745.100582] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] self.force_reraise() [ 745.100582] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 745.100582] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] raise self.value [ 745.100582] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 745.100582] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] updated_port = self._update_port( [ 745.100582] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 745.100582] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] _ensure_no_port_binding_failure(port) [ 745.100582] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 745.100582] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] raise exception.PortBindingFailed(port_id=port['id']) [ 745.100582] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] nova.exception.PortBindingFailed: Binding failed for port 448e1b14-7b08-4172-9ff9-7e7ba7e84a68, please check neutron logs for more information. 
[ 745.100582] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] [ 745.100907] env[59620]: INFO nova.compute.manager [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Terminating instance [ 745.104391] env[59620]: DEBUG oslo_concurrency.lockutils [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Acquiring lock "refresh_cache-6da1d5a5-ff2a-478d-97b8-bf237f844bae" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 745.104533] env[59620]: DEBUG oslo_concurrency.lockutils [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Acquired lock "refresh_cache-6da1d5a5-ff2a-478d-97b8-bf237f844bae" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 745.104636] env[59620]: DEBUG nova.network.neutron [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 745.199027] env[59620]: DEBUG nova.network.neutron [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 745.286196] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d4de92b-3edb-44a2-b976-8401cf441de4 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 745.295735] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-75c14bfb-ac45-43b8-927f-6a710247fd1c {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 745.331178] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-983a406d-d845-4403-bd53-684497b112bf {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 745.339312] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-755f41cf-336a-4741-93c1-aefb8a9ac907 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 745.353848] env[59620]: DEBUG nova.compute.provider_tree [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 745.362663] env[59620]: DEBUG nova.scheduler.client.report [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 
'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 745.379959] env[59620]: DEBUG oslo_concurrency.lockutils [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.330s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 745.380466] env[59620]: DEBUG nova.compute.manager [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] Start building networks asynchronously for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 745.424167] env[59620]: DEBUG nova.compute.utils [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Using /dev/sd instead of None {{(pid=59620) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 745.424844] env[59620]: DEBUG nova.compute.manager [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] Allocating IP information in the background. 
{{(pid=59620) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 745.425065] env[59620]: DEBUG nova.network.neutron [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] allocate_for_instance() {{(pid=59620) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 745.442590] env[59620]: DEBUG nova.compute.manager [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] Start building block device mappings for instance. {{(pid=59620) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 745.532482] env[59620]: DEBUG nova.compute.manager [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] Start spawning the instance on the hypervisor. 
{{(pid=59620) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 745.562470] env[59620]: DEBUG nova.virt.hardware [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-10-16T20:10:58Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-10-16T20:10:40Z,direct_url=,disk_format='vmdk',id=2efa4364-ba59-4de9-978f-169a769ee710,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='f77c77bd878b4e17895863a4a7a8191f',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-10-16T20:10:41Z,virtual_size=,visibility=), allow threads: False {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 745.562470] env[59620]: DEBUG nova.virt.hardware [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Flavor limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 745.562470] env[59620]: DEBUG nova.virt.hardware [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Image limits 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 745.562654] env[59620]: DEBUG nova.virt.hardware [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 
tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Flavor pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 745.562654] env[59620]: DEBUG nova.virt.hardware [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Image pref 0:0:0 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 745.562654] env[59620]: DEBUG nova.virt.hardware [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59620) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 745.562654] env[59620]: DEBUG nova.virt.hardware [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 745.562654] env[59620]: DEBUG nova.virt.hardware [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 745.562806] env[59620]: DEBUG nova.virt.hardware [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Got 1 possible topologies {{(pid=59620) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 745.562806] env[59620]: DEBUG 
nova.virt.hardware [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 745.562806] env[59620]: DEBUG nova.virt.hardware [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59620) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 745.565960] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd6e6d58-59a0-40a7-968c-e32c52590c34 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 745.573419] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f064a6b-17de-4c04-810b-ae905fca349b {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 745.601654] env[59620]: DEBUG nova.policy [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e8748e6b463047f7b05683169bf0a0e7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7415f43b7da842daa77f717471dd89ac', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59620) authorize /opt/stack/nova/nova/policy.py:203}} [ 745.960532] 
env[59620]: DEBUG nova.network.neutron [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 745.972917] env[59620]: DEBUG oslo_concurrency.lockutils [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Releasing lock "refresh_cache-6da1d5a5-ff2a-478d-97b8-bf237f844bae" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 745.973328] env[59620]: DEBUG nova.compute.manager [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Start destroying the instance on the hypervisor. {{(pid=59620) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 745.973508] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Destroying instance {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 745.974061] env[59620]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-93833c13-1891-43a4-af74-1afbb90bb954 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 745.984673] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad682919-8fad-49cc-8609-f8a5f81f6d7e {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 746.006828] env[59620]: WARNING nova.virt.vmwareapi.vmops [None 
req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 6da1d5a5-ff2a-478d-97b8-bf237f844bae could not be found. [ 746.007213] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Instance destroyed {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 746.007290] env[59620]: INFO nova.compute.manager [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Took 0.03 seconds to destroy the instance on the hypervisor. [ 746.007603] env[59620]: DEBUG oslo.service.loopingcall [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=59620) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 746.008078] env[59620]: DEBUG nova.compute.manager [-] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 746.008078] env[59620]: DEBUG nova.network.neutron [-] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 746.176589] env[59620]: DEBUG nova.compute.manager [req-79a70462-71ca-4b60-8513-105ce001145b req-7891a946-20ba-4d81-8131-26fe3b56d68b service nova] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Received event network-changed-448e1b14-7b08-4172-9ff9-7e7ba7e84a68 {{(pid=59620) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 746.176781] env[59620]: DEBUG nova.compute.manager [req-79a70462-71ca-4b60-8513-105ce001145b req-7891a946-20ba-4d81-8131-26fe3b56d68b service nova] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Refreshing instance network info cache due to event network-changed-448e1b14-7b08-4172-9ff9-7e7ba7e84a68. 
{{(pid=59620) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 746.176993] env[59620]: DEBUG oslo_concurrency.lockutils [req-79a70462-71ca-4b60-8513-105ce001145b req-7891a946-20ba-4d81-8131-26fe3b56d68b service nova] Acquiring lock "refresh_cache-6da1d5a5-ff2a-478d-97b8-bf237f844bae" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 746.177144] env[59620]: DEBUG oslo_concurrency.lockutils [req-79a70462-71ca-4b60-8513-105ce001145b req-7891a946-20ba-4d81-8131-26fe3b56d68b service nova] Acquired lock "refresh_cache-6da1d5a5-ff2a-478d-97b8-bf237f844bae" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 746.177295] env[59620]: DEBUG nova.network.neutron [req-79a70462-71ca-4b60-8513-105ce001145b req-7891a946-20ba-4d81-8131-26fe3b56d68b service nova] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Refreshing network info cache for port 448e1b14-7b08-4172-9ff9-7e7ba7e84a68 {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 746.260041] env[59620]: DEBUG nova.network.neutron [req-79a70462-71ca-4b60-8513-105ce001145b req-7891a946-20ba-4d81-8131-26fe3b56d68b service nova] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 746.269065] env[59620]: DEBUG nova.network.neutron [-] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 746.276889] env[59620]: DEBUG nova.network.neutron [-] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 746.285711] env[59620]: INFO nova.compute.manager [-] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Took 0.28 seconds to deallocate network for instance. [ 746.287564] env[59620]: DEBUG nova.compute.claims [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Aborting claim: {{(pid=59620) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 746.287731] env[59620]: DEBUG oslo_concurrency.lockutils [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 746.287933] env[59620]: DEBUG oslo_concurrency.lockutils [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 746.484091] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a62c69bf-64a5-4956-b2fe-328a4cc5b254 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 746.494240] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-3592c6f0-bacc-4067-b5ff-fb40fbea1026 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 746.528825] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a047431e-de09-432e-842b-96faf513358d {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 746.536638] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-53bb4f3b-af8b-4d56-9907-7f27166eb736 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 746.552902] env[59620]: DEBUG nova.compute.provider_tree [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 746.568538] env[59620]: DEBUG nova.scheduler.client.report [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 746.591252] env[59620]: DEBUG oslo_concurrency.lockutils [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "compute_resources" 
"released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.300s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 746.591252] env[59620]: ERROR nova.compute.manager [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 448e1b14-7b08-4172-9ff9-7e7ba7e84a68, please check neutron logs for more information. [ 746.591252] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Traceback (most recent call last): [ 746.591252] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 746.591252] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] self.driver.spawn(context, instance, image_meta, [ 746.591252] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 746.591252] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] self._vmops.spawn(context, instance, image_meta, injected_files, [ 746.591252] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 746.591252] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] vm_ref = self.build_virtual_machine(instance, [ 746.591551] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 746.591551] env[59620]: ERROR nova.compute.manager [instance: 
6da1d5a5-ff2a-478d-97b8-bf237f844bae] vif_infos = vmwarevif.get_vif_info(self._session, [ 746.591551] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 746.591551] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] for vif in network_info: [ 746.591551] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 746.591551] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] return self._sync_wrapper(fn, *args, **kwargs) [ 746.591551] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 746.591551] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] self.wait() [ 746.591551] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 746.591551] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] self[:] = self._gt.wait() [ 746.591551] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 746.591551] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] return self._exit_event.wait() [ 746.591551] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 746.591551] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] result = hub.switch() [ 746.591942] env[59620]: ERROR nova.compute.manager [instance: 
6da1d5a5-ff2a-478d-97b8-bf237f844bae] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 746.591942] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] return self.greenlet.switch()
[ 746.591942] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 746.591942] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] result = function(*args, **kwargs)
[ 746.591942] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 746.591942] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] return func(*args, **kwargs)
[ 746.591942] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 746.591942] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] raise e
[ 746.591942] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 746.591942] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] nwinfo = self.network_api.allocate_for_instance(
[ 746.591942] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 746.591942] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] created_port_ids = self._update_ports_for_instance(
[ 746.591942] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 746.592278] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] with excutils.save_and_reraise_exception():
[ 746.592278] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 746.592278] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] self.force_reraise()
[ 746.592278] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 746.592278] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] raise self.value
[ 746.592278] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 746.592278] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] updated_port = self._update_port(
[ 746.592278] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 746.592278] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] _ensure_no_port_binding_failure(port)
[ 746.592278] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 746.592278] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] raise exception.PortBindingFailed(port_id=port['id'])
[ 746.592278] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] nova.exception.PortBindingFailed: Binding failed for port 448e1b14-7b08-4172-9ff9-7e7ba7e84a68, please check neutron logs for more information.
[ 746.592278] env[59620]: ERROR nova.compute.manager [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae]
[ 746.592727] env[59620]: DEBUG nova.compute.utils [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Binding failed for port 448e1b14-7b08-4172-9ff9-7e7ba7e84a68, please check neutron logs for more information. {{(pid=59620) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 746.594963] env[59620]: DEBUG nova.compute.manager [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Build of instance 6da1d5a5-ff2a-478d-97b8-bf237f844bae was re-scheduled: Binding failed for port 448e1b14-7b08-4172-9ff9-7e7ba7e84a68, please check neutron logs for more information. {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 746.594963] env[59620]: DEBUG nova.compute.manager [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Unplugging VIFs for instance {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 746.595176] env[59620]: DEBUG oslo_concurrency.lockutils [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Acquiring lock "refresh_cache-6da1d5a5-ff2a-478d-97b8-bf237f844bae" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 746.657760] env[59620]: DEBUG nova.network.neutron [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 5c21177e-6cff-414f-bff1-bac166929cab] Successfully created port: 936854f8-ea3b-4b34-a17f-3a77b5316ae0 {{(pid=59620) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 746.958017] env[59620]: DEBUG nova.network.neutron [req-79a70462-71ca-4b60-8513-105ce001145b req-7891a946-20ba-4d81-8131-26fe3b56d68b service nova] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 746.970373] env[59620]: DEBUG oslo_concurrency.lockutils [req-79a70462-71ca-4b60-8513-105ce001145b req-7891a946-20ba-4d81-8131-26fe3b56d68b service nova] Releasing lock "refresh_cache-6da1d5a5-ff2a-478d-97b8-bf237f844bae" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 746.970888] env[59620]: DEBUG oslo_concurrency.lockutils [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Acquired lock "refresh_cache-6da1d5a5-ff2a-478d-97b8-bf237f844bae" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 746.971065] env[59620]: DEBUG nova.network.neutron [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 747.141293] env[59620]: DEBUG nova.network.neutron [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] Successfully created port: 41411f7c-9337-45a9-9fd8-f1fa18d4a0eb {{(pid=59620) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 747.259308] env[59620]: DEBUG nova.network.neutron [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 747.509902] env[59620]: DEBUG nova.network.neutron [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] Successfully created port: adf341d9-a3d3-4ce9-97cf-6c6cfb94962d {{(pid=59620) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 747.789527] env[59620]: DEBUG nova.network.neutron [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 747.801707] env[59620]: DEBUG oslo_concurrency.lockutils [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Releasing lock "refresh_cache-6da1d5a5-ff2a-478d-97b8-bf237f844bae" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 747.803989] env[59620]: DEBUG nova.compute.manager [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 747.804216] env[59620]: DEBUG nova.compute.manager [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 747.804383] env[59620]: DEBUG nova.network.neutron [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 747.826506] env[59620]: DEBUG nova.network.neutron [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] Successfully created port: 76a7ae5a-7d40-42c4-936d-36e078fb7820 {{(pid=59620) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 747.866740] env[59620]: DEBUG nova.network.neutron [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 747.880485] env[59620]: DEBUG nova.network.neutron [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 747.891696] env[59620]: INFO nova.compute.manager [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Took 0.08 seconds to deallocate network for instance.
[ 748.009381] env[59620]: INFO nova.scheduler.client.report [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Deleted allocations for instance 6da1d5a5-ff2a-478d-97b8-bf237f844bae
[ 748.035791] env[59620]: DEBUG oslo_concurrency.lockutils [None req-46e565a3-74ec-48e6-b6ce-9c9a760069f2 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "6da1d5a5-ff2a-478d-97b8-bf237f844bae" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 16.689s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 748.041019] env[59620]: DEBUG oslo_concurrency.lockutils [None req-6e475844-38a1-4d7f-8805-9d7f50da0778 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "6da1d5a5-ff2a-478d-97b8-bf237f844bae" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 15.397s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 748.041019] env[59620]: DEBUG oslo_concurrency.lockutils [None req-6e475844-38a1-4d7f-8805-9d7f50da0778 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Acquiring lock "6da1d5a5-ff2a-478d-97b8-bf237f844bae-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 748.041019] env[59620]: DEBUG oslo_concurrency.lockutils [None req-6e475844-38a1-4d7f-8805-9d7f50da0778 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "6da1d5a5-ff2a-478d-97b8-bf237f844bae-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 748.041019] env[59620]: DEBUG oslo_concurrency.lockutils [None req-6e475844-38a1-4d7f-8805-9d7f50da0778 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "6da1d5a5-ff2a-478d-97b8-bf237f844bae-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 748.043705] env[59620]: INFO nova.compute.manager [None req-6e475844-38a1-4d7f-8805-9d7f50da0778 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Terminating instance
[ 748.044327] env[59620]: DEBUG oslo_concurrency.lockutils [None req-6e475844-38a1-4d7f-8805-9d7f50da0778 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Acquiring lock "refresh_cache-6da1d5a5-ff2a-478d-97b8-bf237f844bae" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 748.046057] env[59620]: DEBUG oslo_concurrency.lockutils [None req-6e475844-38a1-4d7f-8805-9d7f50da0778 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Acquired lock "refresh_cache-6da1d5a5-ff2a-478d-97b8-bf237f844bae" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 748.046249] env[59620]: DEBUG nova.network.neutron [None req-6e475844-38a1-4d7f-8805-9d7f50da0778 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 748.063160] env[59620]: ERROR nova.compute.manager [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 662701b8-6fd9-446f-92d4-22ec5381326d, please check neutron logs for more information.
[ 748.063160] env[59620]: ERROR nova.compute.manager Traceback (most recent call last):
[ 748.063160] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 748.063160] env[59620]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance(
[ 748.063160] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 748.063160] env[59620]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance(
[ 748.063160] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 748.063160] env[59620]: ERROR nova.compute.manager with excutils.save_and_reraise_exception():
[ 748.063160] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 748.063160] env[59620]: ERROR nova.compute.manager self.force_reraise()
[ 748.063160] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 748.063160] env[59620]: ERROR nova.compute.manager raise self.value
[ 748.063160] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 748.063160] env[59620]: ERROR nova.compute.manager updated_port = self._update_port(
[ 748.063160] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 748.063160] env[59620]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port)
[ 748.063610] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 748.063610] env[59620]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id'])
[ 748.063610] env[59620]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 662701b8-6fd9-446f-92d4-22ec5381326d, please check neutron logs for more information.
[ 748.063610] env[59620]: ERROR nova.compute.manager
[ 748.063610] env[59620]: Traceback (most recent call last):
[ 748.063610] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait
[ 748.063610] env[59620]: listener.cb(fileno)
[ 748.063610] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 748.063610] env[59620]: result = function(*args, **kwargs)
[ 748.063610] env[59620]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 748.063610] env[59620]: return func(*args, **kwargs)
[ 748.063610] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 748.063610] env[59620]: raise e
[ 748.063610] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 748.063610] env[59620]: nwinfo = self.network_api.allocate_for_instance(
[ 748.063610] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 748.063610] env[59620]: created_port_ids = self._update_ports_for_instance(
[ 748.063610] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 748.063610] env[59620]: with excutils.save_and_reraise_exception():
[ 748.063610] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 748.063610] env[59620]: self.force_reraise()
[ 748.063610] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 748.063610] env[59620]: raise self.value
[ 748.063610] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 748.063610] env[59620]: updated_port = self._update_port(
[ 748.063610] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 748.063610] env[59620]: _ensure_no_port_binding_failure(port)
[ 748.063610] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 748.063610] env[59620]: raise exception.PortBindingFailed(port_id=port['id'])
[ 748.064360] env[59620]: nova.exception.PortBindingFailed: Binding failed for port 662701b8-6fd9-446f-92d4-22ec5381326d, please check neutron logs for more information.
[ 748.064360] env[59620]: Removing descriptor: 13
[ 748.068509] env[59620]: ERROR nova.compute.manager [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 662701b8-6fd9-446f-92d4-22ec5381326d, please check neutron logs for more information.
[ 748.068509] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] Traceback (most recent call last):
[ 748.068509] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 748.068509] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] yield resources
[ 748.068509] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 748.068509] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] self.driver.spawn(context, instance, image_meta,
[ 748.068509] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 748.068509] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 748.068509] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 748.068509] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] vm_ref = self.build_virtual_machine(instance,
[ 748.068509] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 748.068840] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] vif_infos = vmwarevif.get_vif_info(self._session,
[ 748.068840] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 748.068840] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] for vif in network_info:
[ 748.068840] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 748.068840] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] return self._sync_wrapper(fn, *args, **kwargs)
[ 748.068840] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 748.068840] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] self.wait()
[ 748.068840] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 748.068840] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] self[:] = self._gt.wait()
[ 748.068840] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 748.068840] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] return self._exit_event.wait()
[ 748.068840] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 748.068840] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] result = hub.switch()
[ 748.068840] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 748.069546] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] return self.greenlet.switch()
[ 748.069546] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 748.069546] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] result = function(*args, **kwargs)
[ 748.069546] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 748.069546] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] return func(*args, **kwargs)
[ 748.069546] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 748.069546] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] raise e
[ 748.069546] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 748.069546] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] nwinfo = self.network_api.allocate_for_instance(
[ 748.069546] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 748.069546] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] created_port_ids = self._update_ports_for_instance(
[ 748.069546] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 748.069546] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] with excutils.save_and_reraise_exception():
[ 748.069909] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 748.069909] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] self.force_reraise()
[ 748.069909] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 748.069909] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] raise self.value
[ 748.069909] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 748.069909] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] updated_port = self._update_port(
[ 748.069909] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 748.069909] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] _ensure_no_port_binding_failure(port)
[ 748.069909] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 748.069909] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] raise exception.PortBindingFailed(port_id=port['id'])
[ 748.069909] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] nova.exception.PortBindingFailed: Binding failed for port 662701b8-6fd9-446f-92d4-22ec5381326d, please check neutron logs for more information.
[ 748.069909] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a]
[ 748.070263] env[59620]: INFO nova.compute.manager [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] Terminating instance
[ 748.073274] env[59620]: DEBUG oslo_concurrency.lockutils [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Acquiring lock "refresh_cache-d85db5e9-ce70-477d-bb5c-7665ab69b19a" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 748.073274] env[59620]: DEBUG oslo_concurrency.lockutils [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Acquired lock "refresh_cache-d85db5e9-ce70-477d-bb5c-7665ab69b19a" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 748.073274] env[59620]: DEBUG nova.network.neutron [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 748.341214] env[59620]: DEBUG nova.network.neutron [None req-6e475844-38a1-4d7f-8805-9d7f50da0778 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 748.412081] env[59620]: DEBUG nova.network.neutron [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 748.448114] env[59620]: ERROR nova.compute.manager [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 6319fa19-1380-4966-b525-c126a2fb5462, please check neutron logs for more information.
[ 748.448114] env[59620]: ERROR nova.compute.manager Traceback (most recent call last):
[ 748.448114] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 748.448114] env[59620]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance(
[ 748.448114] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 748.448114] env[59620]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance(
[ 748.448114] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 748.448114] env[59620]: ERROR nova.compute.manager with excutils.save_and_reraise_exception():
[ 748.448114] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 748.448114] env[59620]: ERROR nova.compute.manager self.force_reraise()
[ 748.448114] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 748.448114] env[59620]: ERROR nova.compute.manager raise self.value
[ 748.448114] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 748.448114] env[59620]: ERROR nova.compute.manager updated_port = self._update_port(
[ 748.448114] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 748.448114] env[59620]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port)
[ 748.448612] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 748.448612] env[59620]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id'])
[ 748.448612] env[59620]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 6319fa19-1380-4966-b525-c126a2fb5462, please check neutron logs for more information.
[ 748.448612] env[59620]: ERROR nova.compute.manager
[ 748.448612] env[59620]: Traceback (most recent call last):
[ 748.448612] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait
[ 748.448612] env[59620]: listener.cb(fileno)
[ 748.448612] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 748.448612] env[59620]: result = function(*args, **kwargs)
[ 748.448612] env[59620]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 748.448612] env[59620]: return func(*args, **kwargs)
[ 748.448612] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 748.448612] env[59620]: raise e
[ 748.448612] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 748.448612] env[59620]: nwinfo = self.network_api.allocate_for_instance(
[ 748.448612] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 748.448612] env[59620]: created_port_ids = self._update_ports_for_instance(
[ 748.448612] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 748.448612] env[59620]: with excutils.save_and_reraise_exception():
[ 748.448612] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 748.448612] env[59620]: self.force_reraise()
[ 748.448612] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 748.448612] env[59620]: raise self.value
[ 748.448612] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 748.448612] env[59620]: updated_port = self._update_port(
[ 748.448612] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 748.448612] env[59620]: _ensure_no_port_binding_failure(port)
[ 748.448612] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 748.448612] env[59620]: raise exception.PortBindingFailed(port_id=port['id'])
[ 748.449733] env[59620]: nova.exception.PortBindingFailed: Binding failed for port 6319fa19-1380-4966-b525-c126a2fb5462, please check neutron logs for more information.
[ 748.449733] env[59620]: Removing descriptor: 14
[ 748.449733] env[59620]: ERROR nova.compute.manager [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 7b5558e4-05fc-4755-accf-77228272884f] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 6319fa19-1380-4966-b525-c126a2fb5462, please check neutron logs for more information.
[ 748.449733] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] Traceback (most recent call last):
[ 748.449733] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 748.449733] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] yield resources
[ 748.449733] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 748.449733] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] self.driver.spawn(context, instance, image_meta,
[ 748.449733] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 748.449733] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 748.449733] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 748.449733] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] vm_ref = self.build_virtual_machine(instance,
[ 748.450133] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 748.450133] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] vif_infos = vmwarevif.get_vif_info(self._session,
[ 748.450133] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 748.450133] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] for vif in network_info:
[ 748.450133] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 748.450133] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] return self._sync_wrapper(fn, *args, **kwargs)
[ 748.450133] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 748.450133] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] self.wait()
[ 748.450133] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 748.450133] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] self[:] = self._gt.wait()
[ 748.450133]
env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 748.450133] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] return self._exit_event.wait() [ 748.450133] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 748.450133] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] result = hub.switch() [ 748.450543] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 748.450543] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] return self.greenlet.switch() [ 748.450543] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 748.450543] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] result = function(*args, **kwargs) [ 748.450543] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 748.450543] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] return func(*args, **kwargs) [ 748.450543] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 748.450543] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] raise e [ 748.450543] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] File 
"/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 748.450543] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] nwinfo = self.network_api.allocate_for_instance( [ 748.450543] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 748.450543] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] created_port_ids = self._update_ports_for_instance( [ 748.450543] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 748.450954] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] with excutils.save_and_reraise_exception(): [ 748.450954] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 748.450954] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] self.force_reraise() [ 748.450954] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 748.450954] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] raise self.value [ 748.450954] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 748.450954] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] updated_port = self._update_port( [ 748.450954] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] File 
"/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 748.450954] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] _ensure_no_port_binding_failure(port) [ 748.450954] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 748.450954] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] raise exception.PortBindingFailed(port_id=port['id']) [ 748.450954] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] nova.exception.PortBindingFailed: Binding failed for port 6319fa19-1380-4966-b525-c126a2fb5462, please check neutron logs for more information. [ 748.450954] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] [ 748.451347] env[59620]: INFO nova.compute.manager [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 7b5558e4-05fc-4755-accf-77228272884f] Terminating instance [ 748.452942] env[59620]: DEBUG oslo_concurrency.lockutils [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Acquiring lock "refresh_cache-7b5558e4-05fc-4755-accf-77228272884f" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 748.453112] env[59620]: DEBUG oslo_concurrency.lockutils [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Acquired lock "refresh_cache-7b5558e4-05fc-4755-accf-77228272884f" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 748.453274] env[59620]: DEBUG nova.network.neutron [None 
req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 7b5558e4-05fc-4755-accf-77228272884f] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 748.728025] env[59620]: DEBUG nova.network.neutron [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 7b5558e4-05fc-4755-accf-77228272884f] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 748.921201] env[59620]: DEBUG nova.network.neutron [None req-6e475844-38a1-4d7f-8805-9d7f50da0778 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 748.928238] env[59620]: DEBUG nova.network.neutron [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 748.935355] env[59620]: DEBUG oslo_concurrency.lockutils [None req-6e475844-38a1-4d7f-8805-9d7f50da0778 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Releasing lock "refresh_cache-6da1d5a5-ff2a-478d-97b8-bf237f844bae" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 748.935499] env[59620]: DEBUG nova.compute.manager [None req-6e475844-38a1-4d7f-8805-9d7f50da0778 tempest-ServersTestJSON-864844613 
tempest-ServersTestJSON-864844613-project-member] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Start destroying the instance on the hypervisor. {{(pid=59620) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 748.935617] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-6e475844-38a1-4d7f-8805-9d7f50da0778 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Destroying instance {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 748.936254] env[59620]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-70e0173a-e4bb-4c6e-9b0d-20905284017e {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 748.939514] env[59620]: DEBUG oslo_concurrency.lockutils [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Releasing lock "refresh_cache-d85db5e9-ce70-477d-bb5c-7665ab69b19a" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 748.940024] env[59620]: DEBUG nova.compute.manager [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] Start destroying the instance on the hypervisor. 
{{(pid=59620) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 748.940211] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] Destroying instance {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 748.941042] env[59620]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-fd776161-cb9c-4b0f-a203-2ed736d88412 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 748.949871] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d5f0608-b7cd-4884-bbb5-383088e37ca6 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 748.968258] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c90728fa-4bc5-4ab2-8e5f-77993920ef50 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 748.987558] env[59620]: WARNING nova.virt.vmwareapi.vmops [None req-6e475844-38a1-4d7f-8805-9d7f50da0778 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 6da1d5a5-ff2a-478d-97b8-bf237f844bae could not be found. 
[ 748.987755] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-6e475844-38a1-4d7f-8805-9d7f50da0778 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Instance destroyed {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 748.987925] env[59620]: INFO nova.compute.manager [None req-6e475844-38a1-4d7f-8805-9d7f50da0778 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Took 0.05 seconds to destroy the instance on the hypervisor. [ 748.988723] env[59620]: DEBUG oslo.service.loopingcall [None req-6e475844-38a1-4d7f-8805-9d7f50da0778 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59620) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 748.989096] env[59620]: DEBUG nova.compute.manager [-] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 748.989250] env[59620]: DEBUG nova.network.neutron [-] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 749.000177] env[59620]: WARNING nova.virt.vmwareapi.vmops [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance d85db5e9-ce70-477d-bb5c-7665ab69b19a could not be found. 
[ 749.000398] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] Instance destroyed {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 749.000593] env[59620]: INFO nova.compute.manager [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] Took 0.06 seconds to destroy the instance on the hypervisor. [ 749.000823] env[59620]: DEBUG oslo.service.loopingcall [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59620) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 749.001068] env[59620]: DEBUG nova.compute.manager [-] [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 749.001465] env[59620]: DEBUG nova.network.neutron [-] [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 749.072791] env[59620]: DEBUG nova.network.neutron [-] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 749.088406] env[59620]: DEBUG nova.network.neutron [-] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 749.090210] env[59620]: DEBUG nova.network.neutron [-] [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 749.102077] env[59620]: DEBUG nova.network.neutron [-] [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 749.103152] env[59620]: INFO nova.compute.manager [-] [instance: 6da1d5a5-ff2a-478d-97b8-bf237f844bae] Took 0.11 seconds to deallocate network for instance. [ 749.114446] env[59620]: INFO nova.compute.manager [-] [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] Took 0.11 seconds to deallocate network for instance. 
[ 749.122688] env[59620]: DEBUG nova.compute.claims [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] Aborting claim: {{(pid=59620) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 749.122688] env[59620]: DEBUG oslo_concurrency.lockutils [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 749.122688] env[59620]: DEBUG oslo_concurrency.lockutils [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 749.250499] env[59620]: DEBUG oslo_concurrency.lockutils [None req-6e475844-38a1-4d7f-8805-9d7f50da0778 tempest-ServersTestJSON-864844613 tempest-ServersTestJSON-864844613-project-member] Lock "6da1d5a5-ff2a-478d-97b8-bf237f844bae" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.214s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 749.333068] env[59620]: DEBUG nova.network.neutron [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 7b5558e4-05fc-4755-accf-77228272884f] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info 
/opt/stack/nova/nova/network/neutron.py:116}} [ 749.336028] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6eb62e8-507b-4145-8bfc-3374eaedc80e {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 749.345291] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-31f4bebd-4dad-41fd-9291-99f7d249f173 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 749.349487] env[59620]: DEBUG oslo_concurrency.lockutils [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Releasing lock "refresh_cache-7b5558e4-05fc-4755-accf-77228272884f" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 749.349893] env[59620]: DEBUG nova.compute.manager [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 7b5558e4-05fc-4755-accf-77228272884f] Start destroying the instance on the hypervisor. 
{{(pid=59620) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 749.350099] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 7b5558e4-05fc-4755-accf-77228272884f] Destroying instance {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 749.351223] env[59620]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-6edf987d-07ef-46c6-aa6e-c6740eb0b058 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 749.388577] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bbcbbc71-09db-404d-b403-c1f8d0c71929 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 749.395472] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-95097a65-9104-4f1b-b279-df783fc4c745 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 749.410761] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe6e9802-4238-4b4c-bca5-d7f203564525 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 749.424706] env[59620]: DEBUG nova.compute.provider_tree [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 749.430247] env[59620]: WARNING nova.virt.vmwareapi.vmops [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 
tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 7b5558e4-05fc-4755-accf-77228272884f] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 7b5558e4-05fc-4755-accf-77228272884f could not be found. [ 749.430490] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 7b5558e4-05fc-4755-accf-77228272884f] Instance destroyed {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 749.431090] env[59620]: INFO nova.compute.manager [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 7b5558e4-05fc-4755-accf-77228272884f] Took 0.08 seconds to destroy the instance on the hypervisor. [ 749.431438] env[59620]: DEBUG oslo.service.loopingcall [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=59620) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 749.431794] env[59620]: DEBUG nova.compute.manager [-] [instance: 7b5558e4-05fc-4755-accf-77228272884f] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 749.431893] env[59620]: DEBUG nova.network.neutron [-] [instance: 7b5558e4-05fc-4755-accf-77228272884f] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 749.435652] env[59620]: DEBUG nova.scheduler.client.report [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 749.450196] env[59620]: DEBUG oslo_concurrency.lockutils [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.329s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 749.450836] env[59620]: ERROR nova.compute.manager [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] Failed to build and run 
instance: nova.exception.PortBindingFailed: Binding failed for port 662701b8-6fd9-446f-92d4-22ec5381326d, please check neutron logs for more information. [ 749.450836] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] Traceback (most recent call last): [ 749.450836] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 749.450836] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] self.driver.spawn(context, instance, image_meta, [ 749.450836] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 749.450836] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 749.450836] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 749.450836] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] vm_ref = self.build_virtual_machine(instance, [ 749.450836] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 749.450836] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] vif_infos = vmwarevif.get_vif_info(self._session, [ 749.450836] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 749.451205] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] for vif in network_info: [ 749.451205] env[59620]: ERROR nova.compute.manager 
[instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 749.451205] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] return self._sync_wrapper(fn, *args, **kwargs) [ 749.451205] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 749.451205] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] self.wait() [ 749.451205] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 749.451205] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] self[:] = self._gt.wait() [ 749.451205] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 749.451205] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] return self._exit_event.wait() [ 749.451205] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 749.451205] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] result = hub.switch() [ 749.451205] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 749.451205] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] return self.greenlet.switch() [ 749.451205] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main 
[ 749.451569] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] result = function(*args, **kwargs) [ 749.451569] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 749.451569] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] return func(*args, **kwargs) [ 749.451569] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 749.451569] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] raise e [ 749.451569] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 749.451569] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] nwinfo = self.network_api.allocate_for_instance( [ 749.451569] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 749.451569] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] created_port_ids = self._update_ports_for_instance( [ 749.451569] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 749.451569] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] with excutils.save_and_reraise_exception(): [ 749.451569] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 749.451569] env[59620]: ERROR nova.compute.manager 
[instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] self.force_reraise() [ 749.451883] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 749.451883] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] raise self.value [ 749.451883] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 749.451883] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] updated_port = self._update_port( [ 749.451883] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 749.451883] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] _ensure_no_port_binding_failure(port) [ 749.451883] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 749.451883] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] raise exception.PortBindingFailed(port_id=port['id']) [ 749.451883] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] nova.exception.PortBindingFailed: Binding failed for port 662701b8-6fd9-446f-92d4-22ec5381326d, please check neutron logs for more information. 
[ 749.451883] env[59620]: ERROR nova.compute.manager [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] [ 749.451883] env[59620]: DEBUG nova.compute.utils [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] Binding failed for port 662701b8-6fd9-446f-92d4-22ec5381326d, please check neutron logs for more information. {{(pid=59620) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 749.453410] env[59620]: DEBUG nova.compute.manager [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] Build of instance d85db5e9-ce70-477d-bb5c-7665ab69b19a was re-scheduled: Binding failed for port 662701b8-6fd9-446f-92d4-22ec5381326d, please check neutron logs for more information. {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 749.453945] env[59620]: DEBUG nova.compute.manager [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] Unplugging VIFs for instance {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 749.454232] env[59620]: DEBUG oslo_concurrency.lockutils [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Acquiring lock "refresh_cache-d85db5e9-ce70-477d-bb5c-7665ab69b19a" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 749.454410] env[59620]: DEBUG oslo_concurrency.lockutils [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 
tempest-AttachInterfacesV270Test-2033217154-project-member] Acquired lock "refresh_cache-d85db5e9-ce70-477d-bb5c-7665ab69b19a" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 749.454598] env[59620]: DEBUG nova.network.neutron [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 749.492630] env[59620]: DEBUG nova.network.neutron [-] [instance: 7b5558e4-05fc-4755-accf-77228272884f] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 749.507998] env[59620]: DEBUG nova.network.neutron [-] [instance: 7b5558e4-05fc-4755-accf-77228272884f] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 749.518761] env[59620]: INFO nova.compute.manager [-] [instance: 7b5558e4-05fc-4755-accf-77228272884f] Took 0.09 seconds to deallocate network for instance. 
[ 749.521800] env[59620]: DEBUG nova.compute.claims [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 7b5558e4-05fc-4755-accf-77228272884f] Aborting claim: {{(pid=59620) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 749.521956] env[59620]: DEBUG oslo_concurrency.lockutils [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 749.522111] env[59620]: DEBUG oslo_concurrency.lockutils [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 749.619696] env[59620]: DEBUG nova.network.neutron [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 749.729632] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-56349f8d-32e3-45be-a540-4a790af474dd {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 749.737548] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7874c75c-e8c1-4dd6-964d-d0820d175ff9 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 749.768655] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1d65b1b-d8b5-4f4a-9996-094350c73281 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 749.776523] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-58b9d95c-6f3d-4276-b42e-af531001a9d6 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 749.792353] env[59620]: DEBUG nova.compute.provider_tree [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 749.805086] env[59620]: DEBUG nova.scheduler.client.report [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 
512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 749.825944] env[59620]: DEBUG oslo_concurrency.lockutils [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.304s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 749.826907] env[59620]: ERROR nova.compute.manager [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 7b5558e4-05fc-4755-accf-77228272884f] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 6319fa19-1380-4966-b525-c126a2fb5462, please check neutron logs for more information. 
[ 749.826907] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] Traceback (most recent call last): [ 749.826907] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 749.826907] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] self.driver.spawn(context, instance, image_meta, [ 749.826907] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 749.826907] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 749.826907] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 749.826907] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] vm_ref = self.build_virtual_machine(instance, [ 749.826907] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 749.826907] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] vif_infos = vmwarevif.get_vif_info(self._session, [ 749.826907] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 749.827312] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] for vif in network_info: [ 749.827312] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 749.827312] env[59620]: ERROR 
nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] return self._sync_wrapper(fn, *args, **kwargs) [ 749.827312] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 749.827312] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] self.wait() [ 749.827312] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 749.827312] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] self[:] = self._gt.wait() [ 749.827312] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 749.827312] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] return self._exit_event.wait() [ 749.827312] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 749.827312] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] result = hub.switch() [ 749.827312] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 749.827312] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] return self.greenlet.switch() [ 749.827312] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 749.827712] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] result = function(*args, **kwargs) [ 
749.827712] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 749.827712] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] return func(*args, **kwargs) [ 749.827712] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 749.827712] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] raise e [ 749.827712] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 749.827712] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] nwinfo = self.network_api.allocate_for_instance( [ 749.827712] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 749.827712] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] created_port_ids = self._update_ports_for_instance( [ 749.827712] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 749.827712] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] with excutils.save_and_reraise_exception(): [ 749.827712] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 749.827712] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] self.force_reraise() [ 749.828075] env[59620]: ERROR nova.compute.manager [instance: 
7b5558e4-05fc-4755-accf-77228272884f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 749.828075] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] raise self.value [ 749.828075] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 749.828075] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] updated_port = self._update_port( [ 749.828075] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 749.828075] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] _ensure_no_port_binding_failure(port) [ 749.828075] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 749.828075] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] raise exception.PortBindingFailed(port_id=port['id']) [ 749.828075] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] nova.exception.PortBindingFailed: Binding failed for port 6319fa19-1380-4966-b525-c126a2fb5462, please check neutron logs for more information. [ 749.828075] env[59620]: ERROR nova.compute.manager [instance: 7b5558e4-05fc-4755-accf-77228272884f] [ 749.828320] env[59620]: DEBUG nova.compute.utils [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 7b5558e4-05fc-4755-accf-77228272884f] Binding failed for port 6319fa19-1380-4966-b525-c126a2fb5462, please check neutron logs for more information. 
{{(pid=59620) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 749.829176] env[59620]: DEBUG nova.compute.manager [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 7b5558e4-05fc-4755-accf-77228272884f] Build of instance 7b5558e4-05fc-4755-accf-77228272884f was re-scheduled: Binding failed for port 6319fa19-1380-4966-b525-c126a2fb5462, please check neutron logs for more information. {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 749.829176] env[59620]: DEBUG nova.compute.manager [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 7b5558e4-05fc-4755-accf-77228272884f] Unplugging VIFs for instance {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 749.829618] env[59620]: DEBUG oslo_concurrency.lockutils [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Acquiring lock "refresh_cache-7b5558e4-05fc-4755-accf-77228272884f" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 749.829813] env[59620]: DEBUG oslo_concurrency.lockutils [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Acquired lock "refresh_cache-7b5558e4-05fc-4755-accf-77228272884f" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 749.829979] env[59620]: DEBUG nova.network.neutron [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 7b5558e4-05fc-4755-accf-77228272884f] 
Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 749.919884] env[59620]: DEBUG nova.network.neutron [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 7b5558e4-05fc-4755-accf-77228272884f] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 750.070540] env[59620]: DEBUG nova.network.neutron [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 750.080670] env[59620]: DEBUG oslo_concurrency.lockutils [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Releasing lock "refresh_cache-d85db5e9-ce70-477d-bb5c-7665ab69b19a" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 750.080979] env[59620]: DEBUG nova.compute.manager [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 750.081110] env[59620]: DEBUG nova.compute.manager [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 750.081270] env[59620]: DEBUG nova.network.neutron [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 750.133793] env[59620]: DEBUG nova.network.neutron [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 750.144154] env[59620]: DEBUG nova.network.neutron [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 750.155278] env[59620]: INFO nova.compute.manager [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] [instance: d85db5e9-ce70-477d-bb5c-7665ab69b19a] Took 0.07 seconds to deallocate network for instance. 
[ 750.263351] env[59620]: INFO nova.scheduler.client.report [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Deleted allocations for instance d85db5e9-ce70-477d-bb5c-7665ab69b19a [ 750.286698] env[59620]: DEBUG oslo_concurrency.lockutils [None req-4037931b-b729-4ca5-abe3-1dc67c9f6151 tempest-AttachInterfacesV270Test-2033217154 tempest-AttachInterfacesV270Test-2033217154-project-member] Lock "d85db5e9-ce70-477d-bb5c-7665ab69b19a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 11.952s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 750.431165] env[59620]: DEBUG nova.network.neutron [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 7b5558e4-05fc-4755-accf-77228272884f] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 750.442676] env[59620]: DEBUG oslo_concurrency.lockutils [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Releasing lock "refresh_cache-7b5558e4-05fc-4755-accf-77228272884f" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 750.443015] env[59620]: DEBUG nova.compute.manager [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 750.443241] env[59620]: DEBUG nova.compute.manager [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 7b5558e4-05fc-4755-accf-77228272884f] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 750.443412] env[59620]: DEBUG nova.network.neutron [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 7b5558e4-05fc-4755-accf-77228272884f] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 750.546455] env[59620]: DEBUG nova.network.neutron [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 7b5558e4-05fc-4755-accf-77228272884f] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 750.556516] env[59620]: DEBUG nova.network.neutron [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 7b5558e4-05fc-4755-accf-77228272884f] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 750.567962] env[59620]: INFO nova.compute.manager [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] [instance: 7b5558e4-05fc-4755-accf-77228272884f] Took 0.12 seconds to deallocate network for instance. 
[ 750.671101] env[59620]: INFO nova.scheduler.client.report [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Deleted allocations for instance 7b5558e4-05fc-4755-accf-77228272884f [ 750.689625] env[59620]: DEBUG oslo_concurrency.lockutils [None req-155c1f43-94bf-420d-962a-e1ffeec3b864 tempest-AttachVolumeShelveTestJSON-1392592154 tempest-AttachVolumeShelveTestJSON-1392592154-project-member] Lock "7b5558e4-05fc-4755-accf-77228272884f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 15.940s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 752.275650] env[59620]: ERROR nova.compute.manager [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port ba41efcb-e78a-4f3e-a135-67f3325e12a4, please check neutron logs for more information. 
[ 752.275650] env[59620]: ERROR nova.compute.manager Traceback (most recent call last): [ 752.275650] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 752.275650] env[59620]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 752.275650] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 752.275650] env[59620]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 752.275650] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 752.275650] env[59620]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 752.275650] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 752.275650] env[59620]: ERROR nova.compute.manager self.force_reraise() [ 752.275650] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 752.275650] env[59620]: ERROR nova.compute.manager raise self.value [ 752.275650] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 752.275650] env[59620]: ERROR nova.compute.manager updated_port = self._update_port( [ 752.275650] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 752.275650] env[59620]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 752.276275] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 752.276275] env[59620]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 752.276275] env[59620]: ERROR 
nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port ba41efcb-e78a-4f3e-a135-67f3325e12a4, please check neutron logs for more information. [ 752.276275] env[59620]: ERROR nova.compute.manager [ 752.276275] env[59620]: Traceback (most recent call last): [ 752.276275] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 752.276275] env[59620]: listener.cb(fileno) [ 752.276275] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 752.276275] env[59620]: result = function(*args, **kwargs) [ 752.276275] env[59620]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 752.276275] env[59620]: return func(*args, **kwargs) [ 752.276275] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 752.276275] env[59620]: raise e [ 752.276275] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 752.276275] env[59620]: nwinfo = self.network_api.allocate_for_instance( [ 752.276275] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 752.276275] env[59620]: created_port_ids = self._update_ports_for_instance( [ 752.276275] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 752.276275] env[59620]: with excutils.save_and_reraise_exception(): [ 752.276275] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 752.276275] env[59620]: self.force_reraise() [ 752.276275] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 752.276275] env[59620]: raise self.value [ 752.276275] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 752.276275] env[59620]: updated_port = self._update_port( [ 752.276275] 
env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 752.276275] env[59620]: _ensure_no_port_binding_failure(port) [ 752.276275] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 752.276275] env[59620]: raise exception.PortBindingFailed(port_id=port['id']) [ 752.277359] env[59620]: nova.exception.PortBindingFailed: Binding failed for port ba41efcb-e78a-4f3e-a135-67f3325e12a4, please check neutron logs for more information. [ 752.277359] env[59620]: Removing descriptor: 22 [ 752.277359] env[59620]: ERROR nova.compute.manager [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port ba41efcb-e78a-4f3e-a135-67f3325e12a4, please check neutron logs for more information. [ 752.277359] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] Traceback (most recent call last): [ 752.277359] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 752.277359] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] yield resources [ 752.277359] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 752.277359] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] self.driver.spawn(context, instance, image_meta, [ 752.277359] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 752.277359] env[59620]: ERROR nova.compute.manager [instance: 
fea7d2f4-199d-4c76-84cd-4ee7820990ec] self._vmops.spawn(context, instance, image_meta, injected_files, [ 752.277359] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 752.277359] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] vm_ref = self.build_virtual_machine(instance, [ 752.278609] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 752.278609] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] vif_infos = vmwarevif.get_vif_info(self._session, [ 752.278609] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 752.278609] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] for vif in network_info: [ 752.278609] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 752.278609] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] return self._sync_wrapper(fn, *args, **kwargs) [ 752.278609] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 752.278609] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] self.wait() [ 752.278609] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 752.278609] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] self[:] = self._gt.wait() [ 752.278609] 
env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 752.278609] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] return self._exit_event.wait() [ 752.278609] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 752.278609] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] result = hub.switch() [ 752.279050] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 752.279050] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] return self.greenlet.switch() [ 752.279050] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 752.279050] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] result = function(*args, **kwargs) [ 752.279050] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 752.279050] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] return func(*args, **kwargs) [ 752.279050] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 752.279050] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] raise e [ 752.279050] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] File 
"/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 752.279050] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] nwinfo = self.network_api.allocate_for_instance( [ 752.279050] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 752.279050] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] created_port_ids = self._update_ports_for_instance( [ 752.279050] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 752.279663] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] with excutils.save_and_reraise_exception(): [ 752.279663] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 752.279663] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] self.force_reraise() [ 752.279663] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 752.279663] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] raise self.value [ 752.279663] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 752.279663] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] updated_port = self._update_port( [ 752.279663] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] File 
"/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 752.279663] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] _ensure_no_port_binding_failure(port) [ 752.279663] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 752.279663] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] raise exception.PortBindingFailed(port_id=port['id']) [ 752.279663] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] nova.exception.PortBindingFailed: Binding failed for port ba41efcb-e78a-4f3e-a135-67f3325e12a4, please check neutron logs for more information. [ 752.279663] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] [ 752.280074] env[59620]: INFO nova.compute.manager [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] Terminating instance [ 752.283051] env[59620]: DEBUG oslo_concurrency.lockutils [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Acquiring lock "refresh_cache-fea7d2f4-199d-4c76-84cd-4ee7820990ec" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 752.284746] env[59620]: DEBUG oslo_concurrency.lockutils [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Acquired lock "refresh_cache-fea7d2f4-199d-4c76-84cd-4ee7820990ec" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 752.284950] env[59620]: DEBUG nova.network.neutron [None 
req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 752.372848] env[59620]: DEBUG nova.network.neutron [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 752.939228] env[59620]: ERROR nova.compute.manager [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port a4475db3-1cfd-47df-9fd1-b9a0d4fa35a1, please check neutron logs for more information. 
[ 752.939228] env[59620]: ERROR nova.compute.manager Traceback (most recent call last): [ 752.939228] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 752.939228] env[59620]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 752.939228] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 752.939228] env[59620]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 752.939228] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 752.939228] env[59620]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 752.939228] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 752.939228] env[59620]: ERROR nova.compute.manager self.force_reraise() [ 752.939228] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 752.939228] env[59620]: ERROR nova.compute.manager raise self.value [ 752.939228] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 752.939228] env[59620]: ERROR nova.compute.manager updated_port = self._update_port( [ 752.939228] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 752.939228] env[59620]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 752.939950] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 752.939950] env[59620]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 752.939950] env[59620]: ERROR 
nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port a4475db3-1cfd-47df-9fd1-b9a0d4fa35a1, please check neutron logs for more information. [ 752.939950] env[59620]: ERROR nova.compute.manager [ 752.939950] env[59620]: Traceback (most recent call last): [ 752.939950] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 752.939950] env[59620]: listener.cb(fileno) [ 752.939950] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 752.939950] env[59620]: result = function(*args, **kwargs) [ 752.939950] env[59620]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 752.939950] env[59620]: return func(*args, **kwargs) [ 752.939950] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 752.939950] env[59620]: raise e [ 752.939950] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 752.939950] env[59620]: nwinfo = self.network_api.allocate_for_instance( [ 752.939950] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 752.939950] env[59620]: created_port_ids = self._update_ports_for_instance( [ 752.939950] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 752.939950] env[59620]: with excutils.save_and_reraise_exception(): [ 752.939950] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 752.939950] env[59620]: self.force_reraise() [ 752.939950] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 752.939950] env[59620]: raise self.value [ 752.939950] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 752.939950] env[59620]: updated_port = self._update_port( [ 752.939950] 
env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 752.939950] env[59620]: _ensure_no_port_binding_failure(port) [ 752.939950] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 752.939950] env[59620]: raise exception.PortBindingFailed(port_id=port['id']) [ 752.942096] env[59620]: nova.exception.PortBindingFailed: Binding failed for port a4475db3-1cfd-47df-9fd1-b9a0d4fa35a1, please check neutron logs for more information. [ 752.942096] env[59620]: Removing descriptor: 15 [ 752.942096] env[59620]: ERROR nova.compute.manager [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] [instance: 671639d6-3103-4eeb-86d3-b858a3919396] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port a4475db3-1cfd-47df-9fd1-b9a0d4fa35a1, please check neutron logs for more information. [ 752.942096] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] Traceback (most recent call last): [ 752.942096] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 752.942096] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] yield resources [ 752.942096] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 752.942096] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] self.driver.spawn(context, instance, image_meta, [ 752.942096] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 752.942096] env[59620]: ERROR nova.compute.manager 
[instance: 671639d6-3103-4eeb-86d3-b858a3919396] self._vmops.spawn(context, instance, image_meta, injected_files, [ 752.942096] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 752.942096] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] vm_ref = self.build_virtual_machine(instance, [ 752.942412] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 752.942412] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] vif_infos = vmwarevif.get_vif_info(self._session, [ 752.942412] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 752.942412] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] for vif in network_info: [ 752.942412] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 752.942412] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] return self._sync_wrapper(fn, *args, **kwargs) [ 752.942412] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 752.942412] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] self.wait() [ 752.942412] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 752.942412] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] self[:] = self._gt.wait() [ 752.942412] 
env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 752.942412] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] return self._exit_event.wait() [ 752.942412] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 752.942412] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] result = hub.switch() [ 752.942758] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 752.942758] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] return self.greenlet.switch() [ 752.942758] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 752.942758] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] result = function(*args, **kwargs) [ 752.942758] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 752.942758] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] return func(*args, **kwargs) [ 752.942758] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 752.942758] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] raise e [ 752.942758] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] File 
"/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 752.942758] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] nwinfo = self.network_api.allocate_for_instance( [ 752.942758] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 752.942758] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] created_port_ids = self._update_ports_for_instance( [ 752.942758] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 752.943141] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] with excutils.save_and_reraise_exception(): [ 752.943141] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 752.943141] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] self.force_reraise() [ 752.943141] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 752.943141] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] raise self.value [ 752.943141] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 752.943141] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] updated_port = self._update_port( [ 752.943141] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] File 
"/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 752.943141] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] _ensure_no_port_binding_failure(port) [ 752.943141] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 752.943141] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] raise exception.PortBindingFailed(port_id=port['id']) [ 752.943141] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] nova.exception.PortBindingFailed: Binding failed for port a4475db3-1cfd-47df-9fd1-b9a0d4fa35a1, please check neutron logs for more information. [ 752.943141] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] [ 752.943551] env[59620]: INFO nova.compute.manager [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] [instance: 671639d6-3103-4eeb-86d3-b858a3919396] Terminating instance [ 752.945820] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Acquiring lock "refresh_cache-671639d6-3103-4eeb-86d3-b858a3919396" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 752.945982] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Acquired lock "refresh_cache-671639d6-3103-4eeb-86d3-b858a3919396" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 752.946158] env[59620]: DEBUG nova.network.neutron [None 
req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] [instance: 671639d6-3103-4eeb-86d3-b858a3919396] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 752.988239] env[59620]: DEBUG nova.network.neutron [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 752.996994] env[59620]: DEBUG oslo_concurrency.lockutils [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Releasing lock "refresh_cache-fea7d2f4-199d-4c76-84cd-4ee7820990ec" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 752.997414] env[59620]: DEBUG nova.compute.manager [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] Start destroying the instance on the hypervisor. 
{{(pid=59620) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 752.997597] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] Destroying instance {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 752.998105] env[59620]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-de082ec9-37ae-44a9-a3e6-0eb6252c8ad5 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 753.000969] env[59620]: ERROR nova.compute.manager [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 936854f8-ea3b-4b34-a17f-3a77b5316ae0, please check neutron logs for more information. 
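Every traceback in this log terminates in `_ensure_no_port_binding_failure` (`nova/network/neutron.py:294`), which raises `PortBindingFailed(port_id=port['id'])` as shown in the frames above. As a hedged sketch only (not Nova's actual source), the check amounts to inspecting the port dict returned by neutron and raising once neutron has marked the binding as failed; the `'binding_failed'` vif_type value is an assumption about neutron's port-binding convention, not something visible in this log:

```python
class PortBindingFailed(Exception):
    """Mirrors the nova.exception.PortBindingFailed message seen in the log."""

    def __init__(self, port_id):
        super().__init__(
            f"Binding failed for port {port_id}, please check neutron "
            "logs for more information.")
        self.port_id = port_id


def ensure_no_port_binding_failure(port):
    # Assumption: neutron marks a port that no mechanism driver could
    # bind with binding:vif_type == 'binding_failed'; any other value
    # (e.g. 'ovs') means the binding succeeded and spawn can proceed.
    if port.get('binding:vif_type') == 'binding_failed':
        raise PortBindingFailed(port_id=port['id'])
```

In this run the raise propagates out of `_update_ports_for_instance` via `save_and_reraise_exception`, fails `_allocate_network_async`, and each affected instance is then terminated, which is why every failure above is followed by a "Terminating instance" record and the `refresh_cache-<uuid>` lock dance.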
[ 753.000969] env[59620]: ERROR nova.compute.manager Traceback (most recent call last): [ 753.000969] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 753.000969] env[59620]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 753.000969] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 753.000969] env[59620]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 753.000969] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 753.000969] env[59620]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 753.000969] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 753.000969] env[59620]: ERROR nova.compute.manager self.force_reraise() [ 753.000969] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 753.000969] env[59620]: ERROR nova.compute.manager raise self.value [ 753.000969] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 753.000969] env[59620]: ERROR nova.compute.manager updated_port = self._update_port( [ 753.000969] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 753.000969] env[59620]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 753.001460] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 753.001460] env[59620]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 753.001460] env[59620]: ERROR 
nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 936854f8-ea3b-4b34-a17f-3a77b5316ae0, please check neutron logs for more information. [ 753.001460] env[59620]: ERROR nova.compute.manager [ 753.001460] env[59620]: Traceback (most recent call last): [ 753.001460] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 753.001460] env[59620]: listener.cb(fileno) [ 753.001460] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 753.001460] env[59620]: result = function(*args, **kwargs) [ 753.001460] env[59620]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 753.001460] env[59620]: return func(*args, **kwargs) [ 753.001460] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 753.001460] env[59620]: raise e [ 753.001460] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 753.001460] env[59620]: nwinfo = self.network_api.allocate_for_instance( [ 753.001460] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 753.001460] env[59620]: created_port_ids = self._update_ports_for_instance( [ 753.001460] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 753.001460] env[59620]: with excutils.save_and_reraise_exception(): [ 753.001460] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 753.001460] env[59620]: self.force_reraise() [ 753.001460] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 753.001460] env[59620]: raise self.value [ 753.001460] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 753.001460] env[59620]: updated_port = self._update_port( [ 753.001460] 
env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 753.001460] env[59620]: _ensure_no_port_binding_failure(port) [ 753.001460] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 753.001460] env[59620]: raise exception.PortBindingFailed(port_id=port['id']) [ 753.002471] env[59620]: nova.exception.PortBindingFailed: Binding failed for port 936854f8-ea3b-4b34-a17f-3a77b5316ae0, please check neutron logs for more information. [ 753.002471] env[59620]: Removing descriptor: 11 [ 753.002471] env[59620]: ERROR nova.compute.manager [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 5c21177e-6cff-414f-bff1-bac166929cab] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 936854f8-ea3b-4b34-a17f-3a77b5316ae0, please check neutron logs for more information. [ 753.002471] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] Traceback (most recent call last): [ 753.002471] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 753.002471] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] yield resources [ 753.002471] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 753.002471] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] self.driver.spawn(context, instance, image_meta, [ 753.002471] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 753.002471] env[59620]: ERROR nova.compute.manager 
[instance: 5c21177e-6cff-414f-bff1-bac166929cab] self._vmops.spawn(context, instance, image_meta, injected_files, [ 753.002471] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 753.002471] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] vm_ref = self.build_virtual_machine(instance, [ 753.002803] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 753.002803] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] vif_infos = vmwarevif.get_vif_info(self._session, [ 753.002803] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 753.002803] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] for vif in network_info: [ 753.002803] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 753.002803] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] return self._sync_wrapper(fn, *args, **kwargs) [ 753.002803] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 753.002803] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] self.wait() [ 753.002803] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 753.002803] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] self[:] = self._gt.wait() [ 753.002803] 
env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 753.002803] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] return self._exit_event.wait() [ 753.002803] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 753.002803] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] result = hub.switch() [ 753.003208] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 753.003208] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] return self.greenlet.switch() [ 753.003208] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 753.003208] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] result = function(*args, **kwargs) [ 753.003208] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 753.003208] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] return func(*args, **kwargs) [ 753.003208] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 753.003208] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] raise e [ 753.003208] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] File 
"/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 753.003208] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] nwinfo = self.network_api.allocate_for_instance( [ 753.003208] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 753.003208] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] created_port_ids = self._update_ports_for_instance( [ 753.003208] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 753.003565] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] with excutils.save_and_reraise_exception(): [ 753.003565] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 753.003565] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] self.force_reraise() [ 753.003565] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 753.003565] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] raise self.value [ 753.003565] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 753.003565] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] updated_port = self._update_port( [ 753.003565] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] File 
"/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 753.003565] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] _ensure_no_port_binding_failure(port) [ 753.003565] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 753.003565] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] raise exception.PortBindingFailed(port_id=port['id']) [ 753.003565] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] nova.exception.PortBindingFailed: Binding failed for port 936854f8-ea3b-4b34-a17f-3a77b5316ae0, please check neutron logs for more information. [ 753.003565] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] [ 753.003904] env[59620]: INFO nova.compute.manager [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 5c21177e-6cff-414f-bff1-bac166929cab] Terminating instance [ 753.005179] env[59620]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Acquiring lock "refresh_cache-5c21177e-6cff-414f-bff1-bac166929cab" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 753.005251] env[59620]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Acquired lock "refresh_cache-5c21177e-6cff-414f-bff1-bac166929cab" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 753.005377] env[59620]: DEBUG nova.network.neutron [None 
req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 5c21177e-6cff-414f-bff1-bac166929cab] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 753.011753] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b1c2913-acd3-48c4-af2f-97c2fba8bcff {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 753.033010] env[59620]: DEBUG nova.network.neutron [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] [instance: 671639d6-3103-4eeb-86d3-b858a3919396] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 753.041327] env[59620]: WARNING nova.virt.vmwareapi.vmops [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance fea7d2f4-199d-4c76-84cd-4ee7820990ec could not be found. [ 753.041532] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] Instance destroyed {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 753.041707] env[59620]: INFO nova.compute.manager [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 753.041935] env[59620]: DEBUG oslo.service.loopingcall [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=59620) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 753.042151] env[59620]: DEBUG nova.compute.manager [-] [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 753.042244] env[59620]: DEBUG nova.network.neutron [-] [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 753.090235] env[59620]: DEBUG nova.network.neutron [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 5c21177e-6cff-414f-bff1-bac166929cab] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 753.330055] env[59620]: DEBUG nova.network.neutron [-] [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 753.338495] env[59620]: DEBUG nova.network.neutron [-] [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 753.353290] env[59620]: INFO nova.compute.manager [-] [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] Took 0.31 seconds to deallocate network for instance. 
[ 753.355283] env[59620]: DEBUG nova.compute.claims [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] Aborting claim: {{(pid=59620) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 753.355463] env[59620]: DEBUG oslo_concurrency.lockutils [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 753.355671] env[59620]: DEBUG oslo_concurrency.lockutils [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 753.542787] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d8c1859c-92e6-4410-ab82-0fff759df7bf {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 753.553793] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-385588eb-7cb3-4eb3-8e96-408d9f23de66 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 753.588490] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71a0b70c-7fe5-4f66-9150-d8006bfc0149 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 753.596613] env[59620]: DEBUG oslo_vmware.service [-] 
Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-54bcdd40-543b-46a6-b57b-f0ba50227c13 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 753.612299] env[59620]: DEBUG nova.compute.provider_tree [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 753.621522] env[59620]: DEBUG nova.scheduler.client.report [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 753.641419] env[59620]: DEBUG oslo_concurrency.lockutils [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.285s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 753.641733] env[59620]: ERROR nova.compute.manager [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] 
Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port ba41efcb-e78a-4f3e-a135-67f3325e12a4, please check neutron logs for more information. [ 753.641733] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] Traceback (most recent call last): [ 753.641733] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 753.641733] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] self.driver.spawn(context, instance, image_meta, [ 753.641733] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 753.641733] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] self._vmops.spawn(context, instance, image_meta, injected_files, [ 753.641733] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 753.641733] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] vm_ref = self.build_virtual_machine(instance, [ 753.641733] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 753.641733] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] vif_infos = vmwarevif.get_vif_info(self._session, [ 753.641733] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 753.642170] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] for vif in network_info: [ 753.642170] env[59620]: ERROR 
nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 753.642170] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] return self._sync_wrapper(fn, *args, **kwargs) [ 753.642170] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 753.642170] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] self.wait() [ 753.642170] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 753.642170] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] self[:] = self._gt.wait() [ 753.642170] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 753.642170] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] return self._exit_event.wait() [ 753.642170] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 753.642170] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] result = hub.switch() [ 753.642170] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 753.642170] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] return self.greenlet.switch() [ 753.642170] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] File 
"/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 753.642515] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] result = function(*args, **kwargs) [ 753.642515] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 753.642515] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] return func(*args, **kwargs) [ 753.642515] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 753.642515] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] raise e [ 753.642515] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 753.642515] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] nwinfo = self.network_api.allocate_for_instance( [ 753.642515] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 753.642515] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] created_port_ids = self._update_ports_for_instance( [ 753.642515] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 753.642515] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] with excutils.save_and_reraise_exception(): [ 753.642515] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] File 
"/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 753.642515] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] self.force_reraise() [ 753.642898] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 753.642898] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] raise self.value [ 753.642898] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 753.642898] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] updated_port = self._update_port( [ 753.642898] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 753.642898] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] _ensure_no_port_binding_failure(port) [ 753.642898] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 753.642898] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] raise exception.PortBindingFailed(port_id=port['id']) [ 753.642898] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] nova.exception.PortBindingFailed: Binding failed for port ba41efcb-e78a-4f3e-a135-67f3325e12a4, please check neutron logs for more information. 
[ 753.642898] env[59620]: ERROR nova.compute.manager [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] [ 753.643461] env[59620]: DEBUG nova.compute.utils [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] Binding failed for port ba41efcb-e78a-4f3e-a135-67f3325e12a4, please check neutron logs for more information. {{(pid=59620) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 753.645188] env[59620]: DEBUG nova.compute.manager [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] Build of instance fea7d2f4-199d-4c76-84cd-4ee7820990ec was re-scheduled: Binding failed for port ba41efcb-e78a-4f3e-a135-67f3325e12a4, please check neutron logs for more information. {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 753.645609] env[59620]: DEBUG nova.compute.manager [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] Unplugging VIFs for instance {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 753.646029] env[59620]: DEBUG oslo_concurrency.lockutils [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Acquiring lock "refresh_cache-fea7d2f4-199d-4c76-84cd-4ee7820990ec" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 753.646170] env[59620]: DEBUG oslo_concurrency.lockutils [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Acquired 
lock "refresh_cache-fea7d2f4-199d-4c76-84cd-4ee7820990ec" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 753.646378] env[59620]: DEBUG nova.network.neutron [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 753.749523] env[59620]: DEBUG nova.network.neutron [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 753.767124] env[59620]: DEBUG nova.network.neutron [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] [instance: 671639d6-3103-4eeb-86d3-b858a3919396] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 753.776529] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Releasing lock "refresh_cache-671639d6-3103-4eeb-86d3-b858a3919396" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 753.777858] env[59620]: DEBUG nova.compute.manager [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] [instance: 671639d6-3103-4eeb-86d3-b858a3919396] Start destroying the instance on the hypervisor. 
{{(pid=59620) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 753.778107] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] [instance: 671639d6-3103-4eeb-86d3-b858a3919396] Destroying instance {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 753.778760] env[59620]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-20a30c50-8e20-49bf-938e-6d3bea70ebc0 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 753.787159] env[59620]: DEBUG nova.network.neutron [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 5c21177e-6cff-414f-bff1-bac166929cab] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 753.790370] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21731b2f-1a8d-47ac-971b-123ac1a77118 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 753.809705] env[59620]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Releasing lock "refresh_cache-5c21177e-6cff-414f-bff1-bac166929cab" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 753.810245] env[59620]: DEBUG nova.compute.manager [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 5c21177e-6cff-414f-bff1-bac166929cab] Start destroying the 
instance on the hypervisor. {{(pid=59620) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 753.810488] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 5c21177e-6cff-414f-bff1-bac166929cab] Destroying instance {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 753.811162] env[59620]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-20012dfe-2d1d-4761-b0e6-087694313fdf {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 753.825780] env[59620]: WARNING nova.virt.vmwareapi.vmops [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] [instance: 671639d6-3103-4eeb-86d3-b858a3919396] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 671639d6-3103-4eeb-86d3-b858a3919396 could not be found. [ 753.826052] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] [instance: 671639d6-3103-4eeb-86d3-b858a3919396] Instance destroyed {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 753.826269] env[59620]: INFO nova.compute.manager [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] [instance: 671639d6-3103-4eeb-86d3-b858a3919396] Took 0.05 seconds to destroy the instance on the hypervisor. 
[ 753.826536] env[59620]: DEBUG oslo.service.loopingcall [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=59620) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 753.828359] env[59620]: DEBUG nova.compute.manager [-] [instance: 671639d6-3103-4eeb-86d3-b858a3919396] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 753.828438] env[59620]: DEBUG nova.network.neutron [-] [instance: 671639d6-3103-4eeb-86d3-b858a3919396] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 753.836449] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-582365aa-9d55-484a-9eef-24fb7f7c1138 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 753.855867] env[59620]: WARNING nova.virt.vmwareapi.vmops [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 5c21177e-6cff-414f-bff1-bac166929cab] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 5c21177e-6cff-414f-bff1-bac166929cab could not be found. 
[ 753.856858] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 5c21177e-6cff-414f-bff1-bac166929cab] Instance destroyed {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 753.857125] env[59620]: INFO nova.compute.manager [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 5c21177e-6cff-414f-bff1-bac166929cab] Took 0.05 seconds to destroy the instance on the hypervisor. [ 753.861153] env[59620]: DEBUG oslo.service.loopingcall [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=59620) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 753.861153] env[59620]: DEBUG nova.compute.manager [-] [instance: 5c21177e-6cff-414f-bff1-bac166929cab] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 753.861283] env[59620]: DEBUG nova.network.neutron [-] [instance: 5c21177e-6cff-414f-bff1-bac166929cab] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 753.933144] env[59620]: DEBUG nova.network.neutron [-] [instance: 671639d6-3103-4eeb-86d3-b858a3919396] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 753.943586] env[59620]: DEBUG nova.network.neutron [-] [instance: 671639d6-3103-4eeb-86d3-b858a3919396] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 753.949941] env[59620]: DEBUG nova.network.neutron [-] [instance: 5c21177e-6cff-414f-bff1-bac166929cab] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 753.956215] env[59620]: INFO nova.compute.manager [-] [instance: 671639d6-3103-4eeb-86d3-b858a3919396] Took 0.13 seconds to deallocate network for instance. [ 753.957874] env[59620]: ERROR nova.compute.manager [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 76a7ae5a-7d40-42c4-936d-36e078fb7820, please check neutron logs for more information. 
[ 753.957874] env[59620]: ERROR nova.compute.manager Traceback (most recent call last): [ 753.957874] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 753.957874] env[59620]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 753.957874] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 753.957874] env[59620]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 753.957874] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 753.957874] env[59620]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 753.957874] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 753.957874] env[59620]: ERROR nova.compute.manager self.force_reraise() [ 753.957874] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 753.957874] env[59620]: ERROR nova.compute.manager raise self.value [ 753.957874] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 753.957874] env[59620]: ERROR nova.compute.manager updated_port = self._update_port( [ 753.957874] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 753.957874] env[59620]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 753.958320] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 753.958320] env[59620]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 753.958320] env[59620]: ERROR 
nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 76a7ae5a-7d40-42c4-936d-36e078fb7820, please check neutron logs for more information. [ 753.958320] env[59620]: ERROR nova.compute.manager [ 753.958320] env[59620]: Traceback (most recent call last): [ 753.958320] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 753.958320] env[59620]: listener.cb(fileno) [ 753.958320] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 753.958320] env[59620]: result = function(*args, **kwargs) [ 753.958320] env[59620]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 753.958320] env[59620]: return func(*args, **kwargs) [ 753.958320] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 753.958320] env[59620]: raise e [ 753.958320] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 753.958320] env[59620]: nwinfo = self.network_api.allocate_for_instance( [ 753.958320] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 753.958320] env[59620]: created_port_ids = self._update_ports_for_instance( [ 753.958320] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 753.958320] env[59620]: with excutils.save_and_reraise_exception(): [ 753.958320] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 753.958320] env[59620]: self.force_reraise() [ 753.958320] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 753.958320] env[59620]: raise self.value [ 753.958320] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 753.958320] env[59620]: updated_port = self._update_port( [ 753.958320] 
env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 753.958320] env[59620]: _ensure_no_port_binding_failure(port) [ 753.958320] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 753.958320] env[59620]: raise exception.PortBindingFailed(port_id=port['id']) [ 753.959381] env[59620]: nova.exception.PortBindingFailed: Binding failed for port 76a7ae5a-7d40-42c4-936d-36e078fb7820, please check neutron logs for more information. [ 753.959381] env[59620]: Removing descriptor: 21 [ 753.960048] env[59620]: DEBUG nova.network.neutron [-] [instance: 5c21177e-6cff-414f-bff1-bac166929cab] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 753.961730] env[59620]: ERROR nova.compute.manager [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 76a7ae5a-7d40-42c4-936d-36e078fb7820, please check neutron logs for more information. 
[ 753.961730] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] Traceback (most recent call last): [ 753.961730] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 753.961730] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] yield resources [ 753.961730] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 753.961730] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] self.driver.spawn(context, instance, image_meta, [ 753.961730] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 753.961730] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] self._vmops.spawn(context, instance, image_meta, injected_files, [ 753.961730] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 753.961730] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] vm_ref = self.build_virtual_machine(instance, [ 753.961730] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 753.962133] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] vif_infos = vmwarevif.get_vif_info(self._session, [ 753.962133] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 753.962133] env[59620]: ERROR 
nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] for vif in network_info: [ 753.962133] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 753.962133] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] return self._sync_wrapper(fn, *args, **kwargs) [ 753.962133] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 753.962133] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] self.wait() [ 753.962133] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 753.962133] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] self[:] = self._gt.wait() [ 753.962133] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 753.962133] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] return self._exit_event.wait() [ 753.962133] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 753.962133] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] result = hub.switch() [ 753.962133] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 753.962639] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] return self.greenlet.switch() [ 753.962639] env[59620]: ERROR 
nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 753.962639] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] result = function(*args, **kwargs) [ 753.962639] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 753.962639] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] return func(*args, **kwargs) [ 753.962639] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 753.962639] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] raise e [ 753.962639] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 753.962639] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] nwinfo = self.network_api.allocate_for_instance( [ 753.962639] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 753.962639] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] created_port_ids = self._update_ports_for_instance( [ 753.962639] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 753.962639] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] with excutils.save_and_reraise_exception(): [ 753.963248] env[59620]: ERROR nova.compute.manager [instance: 
d26dfe85-1a71-48e1-b462-f26f1327a9e7] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 753.963248] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] self.force_reraise() [ 753.963248] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 753.963248] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] raise self.value [ 753.963248] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 753.963248] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] updated_port = self._update_port( [ 753.963248] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 753.963248] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] _ensure_no_port_binding_failure(port) [ 753.963248] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 753.963248] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] raise exception.PortBindingFailed(port_id=port['id']) [ 753.963248] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] nova.exception.PortBindingFailed: Binding failed for port 76a7ae5a-7d40-42c4-936d-36e078fb7820, please check neutron logs for more information. 
[ 753.963248] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] [ 753.963899] env[59620]: INFO nova.compute.manager [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] Terminating instance [ 753.965801] env[59620]: DEBUG nova.compute.claims [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] [instance: 671639d6-3103-4eeb-86d3-b858a3919396] Aborting claim: {{(pid=59620) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 753.965972] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 753.966228] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 753.969189] env[59620]: DEBUG oslo_concurrency.lockutils [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Acquiring lock "refresh_cache-d26dfe85-1a71-48e1-b462-f26f1327a9e7" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 753.969347] env[59620]: DEBUG oslo_concurrency.lockutils [None 
req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Acquired lock "refresh_cache-d26dfe85-1a71-48e1-b462-f26f1327a9e7" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 753.969505] env[59620]: DEBUG nova.network.neutron [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 753.975906] env[59620]: INFO nova.compute.manager [-] [instance: 5c21177e-6cff-414f-bff1-bac166929cab] Took 0.11 seconds to deallocate network for instance. [ 753.976648] env[59620]: DEBUG nova.compute.claims [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 5c21177e-6cff-414f-bff1-bac166929cab] Aborting claim: {{(pid=59620) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 753.976987] env[59620]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 754.035731] env[59620]: DEBUG nova.network.neutron [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 754.141355] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-98e1e6e3-4fe5-4462-aa2a-aa8b7ec5f403 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 754.151757] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c4f8e9c-c4c7-4489-9e81-d279300fdc3b {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 754.190146] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec550a11-e256-4743-9fbe-db8a6ffde5a7 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 754.201644] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cca162a5-8693-4b4a-839a-76ee31cb320e {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 754.213890] env[59620]: DEBUG nova.compute.provider_tree [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 754.227886] env[59620]: DEBUG nova.scheduler.client.report [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 
'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 754.253427] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.287s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 754.254322] env[59620]: ERROR nova.compute.manager [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] [instance: 671639d6-3103-4eeb-86d3-b858a3919396] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port a4475db3-1cfd-47df-9fd1-b9a0d4fa35a1, please check neutron logs for more information. 
[ 754.254322] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] Traceback (most recent call last): [ 754.254322] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 754.254322] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] self.driver.spawn(context, instance, image_meta, [ 754.254322] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 754.254322] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] self._vmops.spawn(context, instance, image_meta, injected_files, [ 754.254322] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 754.254322] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] vm_ref = self.build_virtual_machine(instance, [ 754.254322] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 754.254322] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] vif_infos = vmwarevif.get_vif_info(self._session, [ 754.254322] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 754.255027] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] for vif in network_info: [ 754.255027] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 754.255027] env[59620]: ERROR 
nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] return self._sync_wrapper(fn, *args, **kwargs) [ 754.255027] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 754.255027] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] self.wait() [ 754.255027] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 754.255027] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] self[:] = self._gt.wait() [ 754.255027] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 754.255027] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] return self._exit_event.wait() [ 754.255027] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 754.255027] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] result = hub.switch() [ 754.255027] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 754.255027] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] return self.greenlet.switch() [ 754.255027] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 754.255423] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] result = function(*args, **kwargs) [ 
754.255423] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 754.255423] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] return func(*args, **kwargs) [ 754.255423] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 754.255423] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] raise e [ 754.255423] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 754.255423] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] nwinfo = self.network_api.allocate_for_instance( [ 754.255423] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 754.255423] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] created_port_ids = self._update_ports_for_instance( [ 754.255423] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 754.255423] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] with excutils.save_and_reraise_exception(): [ 754.255423] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 754.255423] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] self.force_reraise() [ 754.255810] env[59620]: ERROR nova.compute.manager [instance: 
671639d6-3103-4eeb-86d3-b858a3919396] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 754.255810] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] raise self.value [ 754.255810] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 754.255810] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] updated_port = self._update_port( [ 754.255810] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 754.255810] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] _ensure_no_port_binding_failure(port) [ 754.255810] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 754.255810] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] raise exception.PortBindingFailed(port_id=port['id']) [ 754.255810] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] nova.exception.PortBindingFailed: Binding failed for port a4475db3-1cfd-47df-9fd1-b9a0d4fa35a1, please check neutron logs for more information. [ 754.255810] env[59620]: ERROR nova.compute.manager [instance: 671639d6-3103-4eeb-86d3-b858a3919396] [ 754.255810] env[59620]: DEBUG nova.compute.utils [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] [instance: 671639d6-3103-4eeb-86d3-b858a3919396] Binding failed for port a4475db3-1cfd-47df-9fd1-b9a0d4fa35a1, please check neutron logs for more information. 
{{(pid=59620) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 754.256148] env[59620]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.279s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 754.264019] env[59620]: DEBUG nova.compute.manager [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] [instance: 671639d6-3103-4eeb-86d3-b858a3919396] Build of instance 671639d6-3103-4eeb-86d3-b858a3919396 was re-scheduled: Binding failed for port a4475db3-1cfd-47df-9fd1-b9a0d4fa35a1, please check neutron logs for more information. {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 754.264019] env[59620]: DEBUG nova.compute.manager [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] [instance: 671639d6-3103-4eeb-86d3-b858a3919396] Unplugging VIFs for instance {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 754.264019] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Acquiring lock "refresh_cache-671639d6-3103-4eeb-86d3-b858a3919396" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 754.264019] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Acquired 
lock "refresh_cache-671639d6-3103-4eeb-86d3-b858a3919396" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 754.264288] env[59620]: DEBUG nova.network.neutron [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] [instance: 671639d6-3103-4eeb-86d3-b858a3919396] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 754.428222] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-833050b9-f7a6-4c56-8c97-725db17e2e0e {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 754.435959] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c116d32e-dd18-4b92-8eda-c4e2d21a3177 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 754.475545] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6de01ca5-512e-42fc-a288-0b9a7e2c9161 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 754.480805] env[59620]: DEBUG nova.network.neutron [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 754.484715] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a0a9c712-08f4-486e-b303-9850871e6ece {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 754.491217] env[59620]: DEBUG 
oslo_concurrency.lockutils [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Releasing lock "refresh_cache-fea7d2f4-199d-4c76-84cd-4ee7820990ec" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 754.491217] env[59620]: DEBUG nova.compute.manager [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 754.491412] env[59620]: DEBUG nova.compute.manager [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 754.491473] env[59620]: DEBUG nova.network.neutron [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 754.502828] env[59620]: DEBUG nova.compute.provider_tree [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 754.513878] env[59620]: DEBUG nova.scheduler.client.report [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 
tempest-ListServersNegativeTestJSON-667203106-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 754.521075] env[59620]: DEBUG nova.network.neutron [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] [instance: 671639d6-3103-4eeb-86d3-b858a3919396] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 754.535933] env[59620]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.280s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 754.536601] env[59620]: ERROR nova.compute.manager [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 5c21177e-6cff-414f-bff1-bac166929cab] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 936854f8-ea3b-4b34-a17f-3a77b5316ae0, please check neutron logs for more information. 
[ 754.536601] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] Traceback (most recent call last): [ 754.536601] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 754.536601] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] self.driver.spawn(context, instance, image_meta, [ 754.536601] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 754.536601] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] self._vmops.spawn(context, instance, image_meta, injected_files, [ 754.536601] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 754.536601] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] vm_ref = self.build_virtual_machine(instance, [ 754.536601] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 754.536601] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] vif_infos = vmwarevif.get_vif_info(self._session, [ 754.536601] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 754.536922] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] for vif in network_info: [ 754.536922] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 754.536922] env[59620]: ERROR 
nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] return self._sync_wrapper(fn, *args, **kwargs) [ 754.536922] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 754.536922] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] self.wait() [ 754.536922] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 754.536922] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] self[:] = self._gt.wait() [ 754.536922] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 754.536922] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] return self._exit_event.wait() [ 754.536922] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 754.536922] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] result = hub.switch() [ 754.536922] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 754.536922] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] return self.greenlet.switch() [ 754.536922] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 754.537336] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] result = function(*args, **kwargs) [ 
754.537336] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 754.537336] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] return func(*args, **kwargs) [ 754.537336] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 754.537336] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] raise e [ 754.537336] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 754.537336] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] nwinfo = self.network_api.allocate_for_instance( [ 754.537336] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 754.537336] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] created_port_ids = self._update_ports_for_instance( [ 754.537336] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 754.537336] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] with excutils.save_and_reraise_exception(): [ 754.537336] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 754.537336] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] self.force_reraise() [ 754.537705] env[59620]: ERROR nova.compute.manager [instance: 
5c21177e-6cff-414f-bff1-bac166929cab] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 754.537705] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] raise self.value [ 754.537705] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 754.537705] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] updated_port = self._update_port( [ 754.537705] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 754.537705] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] _ensure_no_port_binding_failure(port) [ 754.537705] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 754.537705] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] raise exception.PortBindingFailed(port_id=port['id']) [ 754.537705] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] nova.exception.PortBindingFailed: Binding failed for port 936854f8-ea3b-4b34-a17f-3a77b5316ae0, please check neutron logs for more information. [ 754.537705] env[59620]: ERROR nova.compute.manager [instance: 5c21177e-6cff-414f-bff1-bac166929cab] [ 754.537984] env[59620]: DEBUG nova.compute.utils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 5c21177e-6cff-414f-bff1-bac166929cab] Binding failed for port 936854f8-ea3b-4b34-a17f-3a77b5316ae0, please check neutron logs for more information. 
{{(pid=59620) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 754.539653] env[59620]: DEBUG nova.compute.manager [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 5c21177e-6cff-414f-bff1-bac166929cab] Build of instance 5c21177e-6cff-414f-bff1-bac166929cab was re-scheduled: Binding failed for port 936854f8-ea3b-4b34-a17f-3a77b5316ae0, please check neutron logs for more information. {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 754.539653] env[59620]: DEBUG nova.compute.manager [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 5c21177e-6cff-414f-bff1-bac166929cab] Unplugging VIFs for instance {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 754.539653] env[59620]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Acquiring lock "refresh_cache-5c21177e-6cff-414f-bff1-bac166929cab" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 754.539653] env[59620]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Acquired lock "refresh_cache-5c21177e-6cff-414f-bff1-bac166929cab" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 754.540193] env[59620]: DEBUG nova.network.neutron [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 5c21177e-6cff-414f-bff1-bac166929cab] 
Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 754.561027] env[59620]: DEBUG nova.network.neutron [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 754.569012] env[59620]: DEBUG nova.network.neutron [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 754.580027] env[59620]: INFO nova.compute.manager [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] [instance: fea7d2f4-199d-4c76-84cd-4ee7820990ec] Took 0.09 seconds to deallocate network for instance. [ 754.615917] env[59620]: DEBUG nova.network.neutron [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 5c21177e-6cff-414f-bff1-bac166929cab] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 754.668105] env[59620]: DEBUG nova.network.neutron [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 754.680446] env[59620]: DEBUG oslo_concurrency.lockutils [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Releasing lock "refresh_cache-d26dfe85-1a71-48e1-b462-f26f1327a9e7" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 754.680446] env[59620]: DEBUG nova.compute.manager [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] Start destroying the instance on the hypervisor. 
{{(pid=59620) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 754.680603] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] Destroying instance {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 754.681124] env[59620]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-20020ce5-35f8-467d-b727-3e83b92ee2b3 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 754.691807] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1849de6-73d1-4a81-92e7-062ae2dca900 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 754.703210] env[59620]: INFO nova.scheduler.client.report [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Deleted allocations for instance fea7d2f4-199d-4c76-84cd-4ee7820990ec [ 754.721751] env[59620]: WARNING nova.virt.vmwareapi.vmops [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance d26dfe85-1a71-48e1-b462-f26f1327a9e7 could not be found. 
[ 754.721979] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] Instance destroyed {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 754.722179] env[59620]: INFO nova.compute.manager [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] Took 0.04 seconds to destroy the instance on the hypervisor. [ 754.722400] env[59620]: DEBUG oslo.service.loopingcall [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=59620) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 754.722627] env[59620]: DEBUG nova.compute.manager [-] [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 754.722718] env[59620]: DEBUG nova.network.neutron [-] [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 754.727267] env[59620]: DEBUG oslo_concurrency.lockutils [None req-bb5b51aa-0873-4476-a278-47a80b942d70 tempest-DeleteServersTestJSON-420469719 tempest-DeleteServersTestJSON-420469719-project-member] Lock "fea7d2f4-199d-4c76-84cd-4ee7820990ec" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 15.009s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 754.778524] env[59620]: DEBUG nova.network.neutron [-] [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 754.794059] env[59620]: DEBUG nova.network.neutron [-] [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 754.805291] env[59620]: INFO nova.compute.manager [-] [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] Took 0.08 seconds to deallocate network for instance. 
[ 754.808057] env[59620]: DEBUG nova.compute.claims [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] Aborting claim: {{(pid=59620) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 754.808323] env[59620]: DEBUG oslo_concurrency.lockutils [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 754.808497] env[59620]: DEBUG oslo_concurrency.lockutils [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 754.976644] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-464d3e00-9856-46ed-83d1-7d803efa9b66 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 754.984318] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-027d75cf-84c1-4f21-b691-164919b2ba7d {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 755.014640] env[59620]: DEBUG nova.network.neutron [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 5c21177e-6cff-414f-bff1-bac166929cab] Updating 
instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 755.018807] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c37099bd-860c-4317-a062-44895ebf489a {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 755.025710] env[59620]: DEBUG nova.network.neutron [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] [instance: 671639d6-3103-4eeb-86d3-b858a3919396] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 755.027893] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb866171-58c4-44f6-b206-8003759dade6 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 755.032940] env[59620]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Releasing lock "refresh_cache-5c21177e-6cff-414f-bff1-bac166929cab" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 755.034360] env[59620]: DEBUG nova.compute.manager [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 755.034609] env[59620]: DEBUG nova.compute.manager [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 5c21177e-6cff-414f-bff1-bac166929cab] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 755.034823] env[59620]: DEBUG nova.network.neutron [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 5c21177e-6cff-414f-bff1-bac166929cab] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 755.048415] env[59620]: DEBUG nova.compute.provider_tree [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 755.053024] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Releasing lock "refresh_cache-671639d6-3103-4eeb-86d3-b858a3919396" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 755.053024] env[59620]: DEBUG nova.compute.manager [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 755.053024] env[59620]: DEBUG nova.compute.manager [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] [instance: 671639d6-3103-4eeb-86d3-b858a3919396] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 755.053024] env[59620]: DEBUG nova.network.neutron [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] [instance: 671639d6-3103-4eeb-86d3-b858a3919396] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 755.055928] env[59620]: DEBUG nova.scheduler.client.report [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 755.069478] env[59620]: DEBUG oslo_concurrency.lockutils [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.261s {{(pid=59620) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 755.070085] env[59620]: ERROR nova.compute.manager [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 76a7ae5a-7d40-42c4-936d-36e078fb7820, please check neutron logs for more information.
[ 755.070085] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] Traceback (most recent call last):
[ 755.070085] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 755.070085] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] self.driver.spawn(context, instance, image_meta,
[ 755.070085] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 755.070085] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 755.070085] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 755.070085] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] vm_ref = self.build_virtual_machine(instance,
[ 755.070085] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 755.070085] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] vif_infos = vmwarevif.get_vif_info(self._session,
[ 755.070085] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 755.070487] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] for vif in network_info:
[ 755.070487] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 755.070487] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] return self._sync_wrapper(fn, *args, **kwargs)
[ 755.070487] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 755.070487] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] self.wait()
[ 755.070487] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 755.070487] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] self[:] = self._gt.wait()
[ 755.070487] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 755.070487] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] return self._exit_event.wait()
[ 755.070487] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 755.070487] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] result = hub.switch()
[ 755.070487] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 755.070487] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] return self.greenlet.switch()
[ 755.070487] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 755.071123] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] result = function(*args, **kwargs)
[ 755.071123] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 755.071123] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] return func(*args, **kwargs)
[ 755.071123] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 755.071123] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] raise e
[ 755.071123] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 755.071123] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] nwinfo = self.network_api.allocate_for_instance(
[ 755.071123] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 755.071123] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] created_port_ids = self._update_ports_for_instance(
[ 755.071123] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 755.071123] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] with excutils.save_and_reraise_exception():
[ 755.071123] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 755.071123] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] self.force_reraise()
[ 755.071512] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 755.071512] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] raise self.value
[ 755.071512] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 755.071512] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] updated_port = self._update_port(
[ 755.071512] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 755.071512] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] _ensure_no_port_binding_failure(port)
[ 755.071512] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 755.071512] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] raise exception.PortBindingFailed(port_id=port['id'])
[ 755.071512] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] nova.exception.PortBindingFailed: Binding failed for port 76a7ae5a-7d40-42c4-936d-36e078fb7820, please check neutron logs for more information.
[ 755.071512] env[59620]: ERROR nova.compute.manager [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7]
[ 755.071958] env[59620]: DEBUG nova.compute.utils [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] Binding failed for port 76a7ae5a-7d40-42c4-936d-36e078fb7820, please check neutron logs for more information. {{(pid=59620) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 755.072204] env[59620]: DEBUG nova.compute.manager [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] Build of instance d26dfe85-1a71-48e1-b462-f26f1327a9e7 was re-scheduled: Binding failed for port 76a7ae5a-7d40-42c4-936d-36e078fb7820, please check neutron logs for more information. {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 755.072617] env[59620]: DEBUG nova.compute.manager [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] Unplugging VIFs for instance {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 755.072832] env[59620]: DEBUG oslo_concurrency.lockutils [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Acquiring lock "refresh_cache-d26dfe85-1a71-48e1-b462-f26f1327a9e7" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 755.072978] env[59620]: DEBUG oslo_concurrency.lockutils [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Acquired lock "refresh_cache-d26dfe85-1a71-48e1-b462-f26f1327a9e7" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 755.073169] env[59620]: DEBUG nova.network.neutron [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 755.095742] env[59620]: DEBUG nova.network.neutron [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 5c21177e-6cff-414f-bff1-bac166929cab] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 755.102640] env[59620]: DEBUG nova.network.neutron [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 5c21177e-6cff-414f-bff1-bac166929cab] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 755.109686] env[59620]: INFO nova.compute.manager [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 5c21177e-6cff-414f-bff1-bac166929cab] Took 0.07 seconds to deallocate network for instance.
[ 755.121848] env[59620]: DEBUG nova.network.neutron [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] [instance: 671639d6-3103-4eeb-86d3-b858a3919396] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 755.127981] env[59620]: DEBUG nova.network.neutron [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] [instance: 671639d6-3103-4eeb-86d3-b858a3919396] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 755.145882] env[59620]: INFO nova.compute.manager [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] [instance: 671639d6-3103-4eeb-86d3-b858a3919396] Took 0.10 seconds to deallocate network for instance.
[ 755.152117] env[59620]: DEBUG nova.network.neutron [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 755.219214] env[59620]: INFO nova.scheduler.client.report [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Deleted allocations for instance 5c21177e-6cff-414f-bff1-bac166929cab
[ 755.240481] env[59620]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Lock "5c21177e-6cff-414f-bff1-bac166929cab" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 11.500s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 755.251226] env[59620]: INFO nova.scheduler.client.report [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Deleted allocations for instance 671639d6-3103-4eeb-86d3-b858a3919396
[ 755.266894] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e1f1c374-8b2f-40e8-88ee-003f665652b3 tempest-ServerAddressesTestJSON-1955780032 tempest-ServerAddressesTestJSON-1955780032-project-member] Lock "671639d6-3103-4eeb-86d3-b858a3919396" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 14.590s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 755.321328] env[59620]: ERROR nova.compute.manager [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 41411f7c-9337-45a9-9fd8-f1fa18d4a0eb, please check neutron logs for more information.
[ 755.321328] env[59620]: ERROR nova.compute.manager Traceback (most recent call last):
[ 755.321328] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 755.321328] env[59620]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance(
[ 755.321328] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 755.321328] env[59620]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance(
[ 755.321328] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 755.321328] env[59620]: ERROR nova.compute.manager with excutils.save_and_reraise_exception():
[ 755.321328] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 755.321328] env[59620]: ERROR nova.compute.manager self.force_reraise()
[ 755.321328] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 755.321328] env[59620]: ERROR nova.compute.manager raise self.value
[ 755.321328] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 755.321328] env[59620]: ERROR nova.compute.manager updated_port = self._update_port(
[ 755.321328] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 755.321328] env[59620]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port)
[ 755.322182] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 755.322182] env[59620]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id'])
[ 755.322182] env[59620]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 41411f7c-9337-45a9-9fd8-f1fa18d4a0eb, please check neutron logs for more information.
[ 755.322182] env[59620]: ERROR nova.compute.manager
[ 755.322182] env[59620]: Traceback (most recent call last):
[ 755.322182] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait
[ 755.322182] env[59620]: listener.cb(fileno)
[ 755.322182] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 755.322182] env[59620]: result = function(*args, **kwargs)
[ 755.322182] env[59620]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 755.322182] env[59620]: return func(*args, **kwargs)
[ 755.322182] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 755.322182] env[59620]: raise e
[ 755.322182] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 755.322182] env[59620]: nwinfo = self.network_api.allocate_for_instance(
[ 755.322182] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 755.322182] env[59620]: created_port_ids = self._update_ports_for_instance(
[ 755.322182] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 755.322182] env[59620]: with excutils.save_and_reraise_exception():
[ 755.322182] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 755.322182] env[59620]: self.force_reraise()
[ 755.322182] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 755.322182] env[59620]: raise self.value
[ 755.322182] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 755.322182] env[59620]: updated_port = self._update_port(
[ 755.322182] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 755.322182] env[59620]: _ensure_no_port_binding_failure(port)
[ 755.322182] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 755.322182] env[59620]: raise exception.PortBindingFailed(port_id=port['id'])
[ 755.322897] env[59620]: nova.exception.PortBindingFailed: Binding failed for port 41411f7c-9337-45a9-9fd8-f1fa18d4a0eb, please check neutron logs for more information.
[ 755.322897] env[59620]: Removing descriptor: 16
[ 755.322897] env[59620]: ERROR nova.compute.manager [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 41411f7c-9337-45a9-9fd8-f1fa18d4a0eb, please check neutron logs for more information.
[ 755.322897] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] Traceback (most recent call last):
[ 755.322897] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 755.322897] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] yield resources
[ 755.322897] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 755.322897] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] self.driver.spawn(context, instance, image_meta,
[ 755.322897] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 755.322897] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 755.322897] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 755.322897] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] vm_ref = self.build_virtual_machine(instance,
[ 755.323206] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 755.323206] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] vif_infos = vmwarevif.get_vif_info(self._session,
[ 755.323206] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 755.323206] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] for vif in network_info:
[ 755.323206] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 755.323206] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] return self._sync_wrapper(fn, *args, **kwargs)
[ 755.323206] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 755.323206] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] self.wait()
[ 755.323206] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 755.323206] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] self[:] = self._gt.wait()
[ 755.323206] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 755.323206] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] return self._exit_event.wait()
[ 755.323206] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 755.323206] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] result = hub.switch()
[ 755.323542] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 755.323542] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] return self.greenlet.switch()
[ 755.323542] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 755.323542] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] result = function(*args, **kwargs)
[ 755.323542] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 755.323542] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] return func(*args, **kwargs)
[ 755.323542] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 755.323542] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] raise e
[ 755.323542] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 755.323542] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] nwinfo = self.network_api.allocate_for_instance(
[ 755.323542] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 755.323542] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] created_port_ids = self._update_ports_for_instance(
[ 755.323542] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 755.323863] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] with excutils.save_and_reraise_exception():
[ 755.323863] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 755.323863] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] self.force_reraise()
[ 755.323863] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 755.323863] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] raise self.value
[ 755.323863] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 755.323863] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] updated_port = self._update_port(
[ 755.323863] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 755.323863] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] _ensure_no_port_binding_failure(port)
[ 755.323863] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 755.323863] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] raise exception.PortBindingFailed(port_id=port['id'])
[ 755.323863] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] nova.exception.PortBindingFailed: Binding failed for port 41411f7c-9337-45a9-9fd8-f1fa18d4a0eb, please check neutron logs for more information.
[ 755.323863] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399]
[ 755.324237] env[59620]: INFO nova.compute.manager [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] Terminating instance
[ 755.326868] env[59620]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Acquiring lock "refresh_cache-f61f8046-f2ee-4de3-9c45-de52c2849399" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 755.327035] env[59620]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Acquired lock "refresh_cache-f61f8046-f2ee-4de3-9c45-de52c2849399" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 755.327992] env[59620]: DEBUG nova.network.neutron [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 755.404841] env[59620]: DEBUG nova.network.neutron [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 755.455590] env[59620]: ERROR nova.compute.manager [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port adf341d9-a3d3-4ce9-97cf-6c6cfb94962d, please check neutron logs for more information.
[ 755.455590] env[59620]: ERROR nova.compute.manager Traceback (most recent call last):
[ 755.455590] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 755.455590] env[59620]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance(
[ 755.455590] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 755.455590] env[59620]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance(
[ 755.455590] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 755.455590] env[59620]: ERROR nova.compute.manager with excutils.save_and_reraise_exception():
[ 755.455590] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 755.455590] env[59620]: ERROR nova.compute.manager self.force_reraise()
[ 755.455590] env[59620]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 755.455590] env[59620]: ERROR nova.compute.manager raise self.value
[ 755.455590] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 755.455590] env[59620]: ERROR nova.compute.manager updated_port = self._update_port(
[ 755.455590] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 755.455590] env[59620]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port)
[ 755.456538] env[59620]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 755.456538] env[59620]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id'])
[ 755.456538] env[59620]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port adf341d9-a3d3-4ce9-97cf-6c6cfb94962d, please check neutron logs for more information.
[ 755.456538] env[59620]: ERROR nova.compute.manager
[ 755.456538] env[59620]: Traceback (most recent call last):
[ 755.456538] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait
[ 755.456538] env[59620]: listener.cb(fileno)
[ 755.456538] env[59620]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 755.456538] env[59620]: result = function(*args, **kwargs)
[ 755.456538] env[59620]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 755.456538] env[59620]: return func(*args, **kwargs)
[ 755.456538] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 755.456538] env[59620]: raise e
[ 755.456538] env[59620]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 755.456538] env[59620]: nwinfo = self.network_api.allocate_for_instance(
[ 755.456538] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 755.456538] env[59620]: created_port_ids = self._update_ports_for_instance(
[ 755.456538] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 755.456538] env[59620]: with excutils.save_and_reraise_exception():
[ 755.456538] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 755.456538] env[59620]: self.force_reraise()
[ 755.456538] env[59620]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 755.456538] env[59620]: raise self.value
[ 755.456538] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 755.456538] env[59620]: updated_port = self._update_port(
[ 755.456538] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 755.456538] env[59620]: _ensure_no_port_binding_failure(port)
[ 755.456538] env[59620]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 755.456538] env[59620]: raise exception.PortBindingFailed(port_id=port['id'])
[ 755.457541] env[59620]: nova.exception.PortBindingFailed: Binding failed for port adf341d9-a3d3-4ce9-97cf-6c6cfb94962d, please check neutron logs for more information.
[ 755.457541] env[59620]: Removing descriptor: 23
[ 755.457541] env[59620]: ERROR nova.compute.manager [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port adf341d9-a3d3-4ce9-97cf-6c6cfb94962d, please check neutron logs for more information.
[ 755.457541] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] Traceback (most recent call last):
[ 755.457541] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 755.457541] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] yield resources
[ 755.457541] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 755.457541] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] self.driver.spawn(context, instance, image_meta,
[ 755.457541] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 755.457541] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 755.457541] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 755.457541] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] vm_ref = self.build_virtual_machine(instance,
[ 755.457967] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 755.457967] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] vif_infos = vmwarevif.get_vif_info(self._session,
[ 755.457967] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 755.457967] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] for vif in network_info:
[ 755.457967] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 755.457967] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] return self._sync_wrapper(fn, *args, **kwargs)
[ 755.457967] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 755.457967] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] self.wait()
[ 755.457967] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 755.457967] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] self[:] = self._gt.wait()
[ 755.457967] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 755.457967] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] return self._exit_event.wait()
[ 755.457967] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 755.457967] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] result = hub.switch()
[ 755.458439] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 755.458439] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] return self.greenlet.switch()
[ 755.458439] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 755.458439] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] result = function(*args, **kwargs)
[ 755.458439] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 755.458439] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] return func(*args, **kwargs)
[ 755.458439] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 755.458439] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] raise e
[ 755.458439] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 755.458439] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] nwinfo = self.network_api.allocate_for_instance(
[ 755.458439] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 755.458439] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] created_port_ids = self._update_ports_for_instance(
[ 755.458439] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 755.458870] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] with excutils.save_and_reraise_exception():
[ 755.458870] env[59620]: ERROR nova.compute.manager [instance: 
7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 755.458870] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] self.force_reraise() [ 755.458870] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 755.458870] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] raise self.value [ 755.458870] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 755.458870] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] updated_port = self._update_port( [ 755.458870] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 755.458870] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] _ensure_no_port_binding_failure(port) [ 755.458870] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 755.458870] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] raise exception.PortBindingFailed(port_id=port['id']) [ 755.458870] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] nova.exception.PortBindingFailed: Binding failed for port adf341d9-a3d3-4ce9-97cf-6c6cfb94962d, please check neutron logs for more information. 
[ 755.458870] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] [ 755.461935] env[59620]: INFO nova.compute.manager [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] Terminating instance [ 755.461935] env[59620]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Acquiring lock "refresh_cache-7c8fc3b5-2d31-498a-ac7c-a8bffe415d70" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 755.461935] env[59620]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Acquired lock "refresh_cache-7c8fc3b5-2d31-498a-ac7c-a8bffe415d70" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 755.461935] env[59620]: DEBUG nova.network.neutron [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 755.748641] env[59620]: DEBUG nova.network.neutron [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 755.760152] env[59620]: DEBUG nova.network.neutron [None 
req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 755.767672] env[59620]: DEBUG oslo_concurrency.lockutils [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Releasing lock "refresh_cache-d26dfe85-1a71-48e1-b462-f26f1327a9e7" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 755.767672] env[59620]: DEBUG nova.compute.manager [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 755.767672] env[59620]: DEBUG nova.compute.manager [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 755.767672] env[59620]: DEBUG nova.network.neutron [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 755.825693] env[59620]: DEBUG nova.network.neutron [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 755.835017] env[59620]: DEBUG nova.network.neutron [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 755.847788] env[59620]: INFO nova.compute.manager [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] [instance: d26dfe85-1a71-48e1-b462-f26f1327a9e7] Took 0.08 seconds to deallocate network for instance.
[ 755.944258] env[59620]: INFO nova.scheduler.client.report [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Deleted allocations for instance d26dfe85-1a71-48e1-b462-f26f1327a9e7
[ 755.961017] env[59620]: DEBUG oslo_concurrency.lockutils [None req-4306ba4c-88fe-46bb-80ff-160802cb8882 tempest-ServerRescueNegativeTestJSON-1950675257 tempest-ServerRescueNegativeTestJSON-1950675257-project-member] Lock "d26dfe85-1a71-48e1-b462-f26f1327a9e7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.988s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 756.045599] env[59620]: DEBUG nova.network.neutron [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 756.065172] env[59620]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Releasing lock "refresh_cache-f61f8046-f2ee-4de3-9c45-de52c2849399" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 756.065172] env[59620]: DEBUG nova.compute.manager [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] Start destroying the instance on the hypervisor. {{(pid=59620) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 756.065172] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] Destroying instance {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 756.065172] env[59620]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-d02c9366-c1f4-4438-9e68-69b1be524bc3 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 756.074029] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2de99713-0bf4-49c6-b7cb-bd00271b0375 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 756.098151] env[59620]: WARNING nova.virt.vmwareapi.vmops [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance f61f8046-f2ee-4de3-9c45-de52c2849399 could not be found.
[ 756.098596] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] Instance destroyed {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 756.098596] env[59620]: INFO nova.compute.manager [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 756.099032] env[59620]: DEBUG oslo.service.loopingcall [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=59620) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 756.099178] env[59620]: DEBUG nova.compute.manager [-] [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 756.099300] env[59620]: DEBUG nova.network.neutron [-] [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 756.130236] env[59620]: DEBUG nova.network.neutron [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 756.135743] env[59620]: DEBUG nova.network.neutron [-] [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 756.148631] env[59620]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Releasing lock "refresh_cache-7c8fc3b5-2d31-498a-ac7c-a8bffe415d70" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 756.149155] env[59620]: DEBUG nova.compute.manager [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] Start destroying the instance on the hypervisor.
{{(pid=59620) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 756.149354] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] Destroying instance {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 756.149632] env[59620]: DEBUG nova.network.neutron [-] [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 756.153595] env[59620]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-85038727-a1d4-477e-b396-1d0f96bb0af0 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 756.160936] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d48cdcdb-c4cf-406e-a590-39eb86f7e07e {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 756.172937] env[59620]: INFO nova.compute.manager [-] [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] Took 0.07 seconds to deallocate network for instance.
[ 756.176043] env[59620]: DEBUG nova.compute.claims [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] Aborting claim: {{(pid=59620) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 756.176320] env[59620]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 756.178382] env[59620]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 756.186843] env[59620]: WARNING nova.virt.vmwareapi.vmops [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70 could not be found.
[ 756.187062] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] Instance destroyed {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 756.187239] env[59620]: INFO nova.compute.manager [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 756.190040] env[59620]: DEBUG oslo.service.loopingcall [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=59620) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 756.190040] env[59620]: DEBUG nova.compute.manager [-] [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 756.190040] env[59620]: DEBUG nova.network.neutron [-] [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 756.235199] env[59620]: DEBUG nova.network.neutron [-] [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 756.246919] env[59620]: DEBUG nova.network.neutron [-] [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 756.264168] env[59620]: INFO nova.compute.manager [-] [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] Took 0.08 seconds to deallocate network for instance.
[ 756.269314] env[59620]: DEBUG nova.compute.claims [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] Aborting claim: {{(pid=59620) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 756.269314] env[59620]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 756.301538] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d28a7bda-23ba-4539-b5e9-22e3cdb2ac39 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 756.314540] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f94f8412-1c2a-4bd1-b4cb-14f2edc93a57 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 756.348687] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b165c8e-7b2e-4839-9eb8-a4fed514320a {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 756.357530] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ed73a68-bdb9-48d6-90c4-4ec3b12addb7 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 756.373281] env[59620]: DEBUG nova.compute.provider_tree [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 756.383317] env[59620]: DEBUG nova.scheduler.client.report [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 756.406199] env[59620]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.229s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 756.406357] env[59620]: ERROR nova.compute.manager [None
req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 41411f7c-9337-45a9-9fd8-f1fa18d4a0eb, please check neutron logs for more information.
[ 756.406357] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] Traceback (most recent call last):
[ 756.406357] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 756.406357] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] self.driver.spawn(context, instance, image_meta,
[ 756.406357] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 756.406357] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 756.406357] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 756.406357] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] vm_ref = self.build_virtual_machine(instance,
[ 756.406357] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 756.406357] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] vif_infos = vmwarevif.get_vif_info(self._session,
[ 756.406357] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 756.406662] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] for vif in network_info:
[ 756.406662] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 756.406662] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] return self._sync_wrapper(fn, *args, **kwargs)
[ 756.406662] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 756.406662] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] self.wait()
[ 756.406662] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 756.406662] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] self[:] = self._gt.wait()
[ 756.406662] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 756.406662] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] return self._exit_event.wait()
[ 756.406662] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 756.406662] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] result = hub.switch()
[ 756.406662] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 756.406662] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] return self.greenlet.switch()
[ 756.406662] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 756.407011] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] result = function(*args, **kwargs)
[ 756.407011] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 756.407011] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] return func(*args, **kwargs)
[ 756.407011] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 756.407011] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] raise e
[ 756.407011] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 756.407011] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] nwinfo = self.network_api.allocate_for_instance(
[ 756.407011] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 756.407011] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] created_port_ids = self._update_ports_for_instance(
[ 756.407011] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 756.407011] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] with excutils.save_and_reraise_exception():
[ 756.407011] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 756.407011] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] self.force_reraise()
[ 756.407318] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 756.407318] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] raise self.value
[ 756.407318] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 756.407318] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] updated_port = self._update_port(
[ 756.407318] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 756.407318] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] _ensure_no_port_binding_failure(port)
[ 756.407318] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 756.407318] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] raise exception.PortBindingFailed(port_id=port['id'])
[ 756.407318] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] nova.exception.PortBindingFailed: Binding failed for port 41411f7c-9337-45a9-9fd8-f1fa18d4a0eb, please check neutron logs for more information.
[ 756.407318] env[59620]: ERROR nova.compute.manager [instance: f61f8046-f2ee-4de3-9c45-de52c2849399]
[ 756.407564] env[59620]: DEBUG nova.compute.utils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] Binding failed for port 41411f7c-9337-45a9-9fd8-f1fa18d4a0eb, please check neutron logs for more information. {{(pid=59620) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 756.408238] env[59620]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.139s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 756.411693] env[59620]: DEBUG nova.compute.manager [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] Build of instance f61f8046-f2ee-4de3-9c45-de52c2849399 was re-scheduled: Binding failed for port 41411f7c-9337-45a9-9fd8-f1fa18d4a0eb, please check neutron logs for more information. {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 756.412456] env[59620]: DEBUG nova.compute.manager [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] Unplugging VIFs for instance {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 756.412456] env[59620]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Acquiring lock "refresh_cache-f61f8046-f2ee-4de3-9c45-de52c2849399" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 756.412571] env[59620]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Acquired lock "refresh_cache-f61f8046-f2ee-4de3-9c45-de52c2849399" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 756.412635] env[59620]: DEBUG nova.network.neutron [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 756.497412] env[59620]: DEBUG nova.network.neutron [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] Instance cache missing network info.
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 756.555099] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c0da6b9e-66b8-4f56-ae78-248a219ffc43 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 756.566164] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d3b270b-b9f8-4c18-a7e0-d199ed0beea3 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 756.598075] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e4c0823-83fd-4364-b942-7814be4f608e {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 756.605378] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05223eee-79c2-4c21-b04c-1dc3b9223892 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 756.621899] env[59620]: DEBUG nova.compute.provider_tree [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 756.631555] env[59620]: DEBUG nova.scheduler.client.report [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 
512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 756.655705] env[59620]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.247s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 756.656552] env[59620]: ERROR nova.compute.manager [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port adf341d9-a3d3-4ce9-97cf-6c6cfb94962d, please check neutron logs for more information. 
[ 756.656552] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] Traceback (most recent call last): [ 756.656552] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 756.656552] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] self.driver.spawn(context, instance, image_meta, [ 756.656552] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 756.656552] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] self._vmops.spawn(context, instance, image_meta, injected_files, [ 756.656552] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 756.656552] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] vm_ref = self.build_virtual_machine(instance, [ 756.656552] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 756.656552] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] vif_infos = vmwarevif.get_vif_info(self._session, [ 756.656552] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 756.656900] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] for vif in network_info: [ 756.656900] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 756.656900] env[59620]: ERROR 
nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] return self._sync_wrapper(fn, *args, **kwargs) [ 756.656900] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 756.656900] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] self.wait() [ 756.656900] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 756.656900] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] self[:] = self._gt.wait() [ 756.656900] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 756.656900] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] return self._exit_event.wait() [ 756.656900] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 756.656900] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] result = hub.switch() [ 756.656900] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 756.656900] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] return self.greenlet.switch() [ 756.656900] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 756.657303] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] result = function(*args, **kwargs) [ 
756.657303] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 756.657303] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] return func(*args, **kwargs) [ 756.657303] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 756.657303] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] raise e [ 756.657303] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 756.657303] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] nwinfo = self.network_api.allocate_for_instance( [ 756.657303] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 756.657303] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] created_port_ids = self._update_ports_for_instance( [ 756.657303] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 756.657303] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] with excutils.save_and_reraise_exception(): [ 756.657303] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 756.657303] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] self.force_reraise() [ 756.657627] env[59620]: ERROR nova.compute.manager [instance: 
7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 756.657627] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] raise self.value [ 756.657627] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 756.657627] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] updated_port = self._update_port( [ 756.657627] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 756.657627] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] _ensure_no_port_binding_failure(port) [ 756.657627] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 756.657627] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] raise exception.PortBindingFailed(port_id=port['id']) [ 756.657627] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] nova.exception.PortBindingFailed: Binding failed for port adf341d9-a3d3-4ce9-97cf-6c6cfb94962d, please check neutron logs for more information. [ 756.657627] env[59620]: ERROR nova.compute.manager [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] [ 756.657906] env[59620]: DEBUG nova.compute.utils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] Binding failed for port adf341d9-a3d3-4ce9-97cf-6c6cfb94962d, please check neutron logs for more information. 
{{(pid=59620) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 756.659160] env[59620]: DEBUG nova.compute.manager [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] Build of instance 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70 was re-scheduled: Binding failed for port adf341d9-a3d3-4ce9-97cf-6c6cfb94962d, please check neutron logs for more information. {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 756.659651] env[59620]: DEBUG nova.compute.manager [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] Unplugging VIFs for instance {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 756.659814] env[59620]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Acquiring lock "refresh_cache-7c8fc3b5-2d31-498a-ac7c-a8bffe415d70" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 756.659914] env[59620]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Acquired lock "refresh_cache-7c8fc3b5-2d31-498a-ac7c-a8bffe415d70" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 756.660082] env[59620]: DEBUG nova.network.neutron [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] 
Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 756.725569] env[59620]: DEBUG nova.network.neutron [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 757.082563] env[59620]: DEBUG nova.network.neutron [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 757.097030] env[59620]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Releasing lock "refresh_cache-f61f8046-f2ee-4de3-9c45-de52c2849399" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 757.097030] env[59620]: DEBUG nova.compute.manager [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 757.097030] env[59620]: DEBUG nova.compute.manager [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 757.097030] env[59620]: DEBUG nova.network.neutron [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 757.153610] env[59620]: DEBUG nova.network.neutron [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] Instance cache missing network info. {{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 757.161524] env[59620]: DEBUG nova.network.neutron [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 757.173727] env[59620]: INFO nova.compute.manager [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: f61f8046-f2ee-4de3-9c45-de52c2849399] Took 0.08 seconds to deallocate network for instance. 
[ 757.273092] env[59620]: DEBUG nova.network.neutron [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 757.286443] env[59620]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Releasing lock "refresh_cache-7c8fc3b5-2d31-498a-ac7c-a8bffe415d70" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 757.286443] env[59620]: DEBUG nova.compute.manager [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 757.286443] env[59620]: DEBUG nova.compute.manager [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] Deallocating network for instance {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 757.286443] env[59620]: DEBUG nova.network.neutron [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] deallocate_for_instance() {{(pid=59620) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 757.294837] env[59620]: INFO nova.scheduler.client.report [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Deleted allocations for instance f61f8046-f2ee-4de3-9c45-de52c2849399 [ 757.320042] env[59620]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Lock "f61f8046-f2ee-4de3-9c45-de52c2849399" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 13.698s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 757.347558] env[59620]: DEBUG nova.network.neutron [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 757.357091] env[59620]: DEBUG nova.network.neutron [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 757.366767] env[59620]: INFO nova.compute.manager [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] [instance: 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70] Took 0.08 seconds to deallocate network for instance. [ 757.482132] env[59620]: INFO nova.scheduler.client.report [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Deleted allocations for instance 7c8fc3b5-2d31-498a-ac7c-a8bffe415d70 [ 757.503040] env[59620]: DEBUG oslo_concurrency.lockutils [None req-a3f61a8f-df84-435b-a071-33213b53f37c tempest-ListServersNegativeTestJSON-667203106 tempest-ListServersNegativeTestJSON-667203106-project-member] Lock "7c8fc3b5-2d31-498a-ac7c-a8bffe415d70" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 13.811s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 788.852276] env[59620]: WARNING oslo_vmware.rw_handles [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 788.852276] env[59620]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 788.852276] 
env[59620]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 788.852276] env[59620]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 788.852276] env[59620]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 788.852276] env[59620]: ERROR oslo_vmware.rw_handles response.begin() [ 788.852276] env[59620]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 788.852276] env[59620]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 788.852276] env[59620]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 788.852276] env[59620]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 788.852276] env[59620]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 788.852276] env[59620]: ERROR oslo_vmware.rw_handles [ 788.852963] env[59620]: DEBUG nova.virt.vmwareapi.images [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] Downloaded image file data 2efa4364-ba59-4de9-978f-169a769ee710 to vmware_temp/29544aff-9dd0-4af0-800c-bebbc10731fe/2efa4364-ba59-4de9-978f-169a769ee710/tmp-sparse.vmdk on the data store datastore1 {{(pid=59620) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 788.854326] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] Caching image {{(pid=59620) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 788.854587] env[59620]: DEBUG 
nova.virt.vmwareapi.vm_util [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Copying Virtual Disk [datastore1] vmware_temp/29544aff-9dd0-4af0-800c-bebbc10731fe/2efa4364-ba59-4de9-978f-169a769ee710/tmp-sparse.vmdk to [datastore1] vmware_temp/29544aff-9dd0-4af0-800c-bebbc10731fe/2efa4364-ba59-4de9-978f-169a769ee710/2efa4364-ba59-4de9-978f-169a769ee710.vmdk {{(pid=59620) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 788.854878] env[59620]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-ac402fd2-2c30-4a66-92f0-054b86910867 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 788.863772] env[59620]: DEBUG oslo_vmware.api [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Waiting for the task: (returnval){ [ 788.863772] env[59620]: value = "task-1308642" [ 788.863772] env[59620]: _type = "Task" [ 788.863772] env[59620]: } to complete. {{(pid=59620) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 788.871684] env[59620]: DEBUG oslo_vmware.api [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Task: {'id': task-1308642, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59620) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 789.384842] env[59620]: DEBUG oslo_vmware.exceptions [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Fault InvalidArgument not matched. 
{{(pid=59620) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 789.385299] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Releasing lock "[datastore1] devstack-image-cache_base/2efa4364-ba59-4de9-978f-169a769ee710/2efa4364-ba59-4de9-978f-169a769ee710.vmdk" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 789.386330] env[59620]: ERROR nova.compute.manager [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 789.386330] env[59620]: Faults: ['InvalidArgument'] [ 789.386330] env[59620]: ERROR nova.compute.manager [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] Traceback (most recent call last): [ 789.386330] env[59620]: ERROR nova.compute.manager [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 789.386330] env[59620]: ERROR nova.compute.manager [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] yield resources [ 789.386330] env[59620]: ERROR nova.compute.manager [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 789.386330] env[59620]: ERROR nova.compute.manager [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] self.driver.spawn(context, instance, image_meta, [ 789.386330] env[59620]: ERROR nova.compute.manager [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 789.386330] env[59620]: ERROR nova.compute.manager [instance: 
50bbfdd5-bac5-4634-bc5d-c215a31889e9] self._vmops.spawn(context, instance, image_meta, injected_files, [ 789.386330] env[59620]: ERROR nova.compute.manager [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 789.386330] env[59620]: ERROR nova.compute.manager [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] self._fetch_image_if_missing(context, vi) [ 789.386330] env[59620]: ERROR nova.compute.manager [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 789.386935] env[59620]: ERROR nova.compute.manager [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] image_cache(vi, tmp_image_ds_loc) [ 789.386935] env[59620]: ERROR nova.compute.manager [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 789.386935] env[59620]: ERROR nova.compute.manager [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] vm_util.copy_virtual_disk( [ 789.386935] env[59620]: ERROR nova.compute.manager [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 789.386935] env[59620]: ERROR nova.compute.manager [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] session._wait_for_task(vmdk_copy_task) [ 789.386935] env[59620]: ERROR nova.compute.manager [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 789.386935] env[59620]: ERROR nova.compute.manager [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] return self.wait_for_task(task_ref) [ 789.386935] env[59620]: ERROR nova.compute.manager [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 789.386935] env[59620]: ERROR nova.compute.manager [instance: 
50bbfdd5-bac5-4634-bc5d-c215a31889e9] return evt.wait() [ 789.386935] env[59620]: ERROR nova.compute.manager [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 789.386935] env[59620]: ERROR nova.compute.manager [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] result = hub.switch() [ 789.386935] env[59620]: ERROR nova.compute.manager [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 789.386935] env[59620]: ERROR nova.compute.manager [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] return self.greenlet.switch() [ 789.389521] env[59620]: ERROR nova.compute.manager [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 789.389521] env[59620]: ERROR nova.compute.manager [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] self.f(*self.args, **self.kw) [ 789.389521] env[59620]: ERROR nova.compute.manager [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 789.389521] env[59620]: ERROR nova.compute.manager [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] raise exceptions.translate_fault(task_info.error) [ 789.389521] env[59620]: ERROR nova.compute.manager [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 789.389521] env[59620]: ERROR nova.compute.manager [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] Faults: ['InvalidArgument'] [ 789.389521] env[59620]: ERROR nova.compute.manager [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] [ 789.389521] env[59620]: INFO nova.compute.manager [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] 
[instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] Terminating instance [ 789.390497] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Acquiring lock "refresh_cache-50bbfdd5-bac5-4634-bc5d-c215a31889e9" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 789.390797] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Acquired lock "refresh_cache-50bbfdd5-bac5-4634-bc5d-c215a31889e9" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 789.391532] env[59620]: DEBUG nova.network.neutron [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 789.442585] env[59620]: DEBUG nova.network.neutron [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 789.537071] env[59620]: DEBUG nova.network.neutron [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 789.547975] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Releasing lock "refresh_cache-50bbfdd5-bac5-4634-bc5d-c215a31889e9" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 789.548430] env[59620]: DEBUG nova.compute.manager [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] Start destroying the instance on the hypervisor. 
{{(pid=59620) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 789.548616] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] Destroying instance {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 789.549999] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df5c3fac-a6f4-475a-919d-08d63ebe791c {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 789.559776] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] Unregistering the VM {{(pid=59620) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 789.560082] env[59620]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-9f470aee-7cf1-4f8c-9a65-157b79cc4043 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 789.589578] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] Unregistered the VM {{(pid=59620) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 789.589899] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] Deleting contents of the VM from datastore datastore1 {{(pid=59620) _destroy_instance 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 789.590111] env[59620]: DEBUG nova.virt.vmwareapi.ds_util [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Deleting the datastore file [datastore1] 50bbfdd5-bac5-4634-bc5d-c215a31889e9 {{(pid=59620) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 789.590385] env[59620]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-7099d2fb-5b54-47ce-b326-42bda6ee63b8 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 789.600963] env[59620]: DEBUG oslo_vmware.api [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Waiting for the task: (returnval){ [ 789.600963] env[59620]: value = "task-1308644" [ 789.600963] env[59620]: _type = "Task" [ 789.600963] env[59620]: } to complete. {{(pid=59620) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 789.611880] env[59620]: DEBUG oslo_vmware.api [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Task: {'id': task-1308644, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59620) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 790.118739] env[59620]: DEBUG oslo_vmware.api [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Task: {'id': task-1308644, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.035089} completed successfully. 
{{(pid=59620) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 790.118991] env[59620]: DEBUG nova.virt.vmwareapi.ds_util [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Deleted the datastore file {{(pid=59620) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 790.119270] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] Deleted contents of the VM from datastore datastore1 {{(pid=59620) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 790.119270] env[59620]: DEBUG nova.virt.vmwareapi.vmops [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] Instance destroyed {{(pid=59620) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 790.119407] env[59620]: INFO nova.compute.manager [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] Took 0.57 seconds to destroy the instance on the hypervisor. [ 790.119907] env[59620]: DEBUG oslo.service.loopingcall [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=59620) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 790.119975] env[59620]: DEBUG nova.compute.manager [-] [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] Skipping network deallocation for instance since networking was not requested. {{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 790.122168] env[59620]: DEBUG nova.compute.claims [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] Aborting claim: {{(pid=59620) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 790.122324] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 790.122572] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 790.223033] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d2629136-6ab8-4c40-8fd9-75723f6252d3 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 790.230735] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5363f650-bcf4-485b-8602-547d903fa490 {{(pid=59620) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 790.264021] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff8cb1bc-f04c-403b-a940-33385f9bc133 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 790.272734] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6cb2bb3f-56bd-42bc-9fea-ef8e33ce4758 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 790.288395] env[59620]: DEBUG nova.compute.provider_tree [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 790.297686] env[59620]: DEBUG nova.scheduler.client.report [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 790.321328] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.199s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 790.321932] env[59620]: ERROR nova.compute.manager [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 790.321932] env[59620]: Faults: ['InvalidArgument'] [ 790.321932] env[59620]: ERROR nova.compute.manager [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] Traceback (most recent call last): [ 790.321932] env[59620]: ERROR nova.compute.manager [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 790.321932] env[59620]: ERROR nova.compute.manager [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] self.driver.spawn(context, instance, image_meta, [ 790.321932] env[59620]: ERROR nova.compute.manager [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 790.321932] env[59620]: ERROR nova.compute.manager [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] self._vmops.spawn(context, instance, image_meta, injected_files, [ 790.321932] env[59620]: ERROR nova.compute.manager [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 790.321932] env[59620]: ERROR nova.compute.manager [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] self._fetch_image_if_missing(context, vi) [ 790.321932] env[59620]: ERROR nova.compute.manager [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 790.321932] env[59620]: ERROR nova.compute.manager [instance: 
50bbfdd5-bac5-4634-bc5d-c215a31889e9] image_cache(vi, tmp_image_ds_loc) [ 790.321932] env[59620]: ERROR nova.compute.manager [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 790.322342] env[59620]: ERROR nova.compute.manager [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] vm_util.copy_virtual_disk( [ 790.322342] env[59620]: ERROR nova.compute.manager [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 790.322342] env[59620]: ERROR nova.compute.manager [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] session._wait_for_task(vmdk_copy_task) [ 790.322342] env[59620]: ERROR nova.compute.manager [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 790.322342] env[59620]: ERROR nova.compute.manager [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] return self.wait_for_task(task_ref) [ 790.322342] env[59620]: ERROR nova.compute.manager [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 790.322342] env[59620]: ERROR nova.compute.manager [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] return evt.wait() [ 790.322342] env[59620]: ERROR nova.compute.manager [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 790.322342] env[59620]: ERROR nova.compute.manager [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] result = hub.switch() [ 790.322342] env[59620]: ERROR nova.compute.manager [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 790.322342] env[59620]: ERROR nova.compute.manager [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] return self.greenlet.switch() [ 
790.322342] env[59620]: ERROR nova.compute.manager [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 790.322342] env[59620]: ERROR nova.compute.manager [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] self.f(*self.args, **self.kw) [ 790.322769] env[59620]: ERROR nova.compute.manager [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 790.322769] env[59620]: ERROR nova.compute.manager [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] raise exceptions.translate_fault(task_info.error) [ 790.322769] env[59620]: ERROR nova.compute.manager [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 790.322769] env[59620]: ERROR nova.compute.manager [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] Faults: ['InvalidArgument'] [ 790.322769] env[59620]: ERROR nova.compute.manager [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] [ 790.322769] env[59620]: DEBUG nova.compute.utils [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] VimFaultException {{(pid=59620) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 790.327940] env[59620]: DEBUG nova.compute.manager [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] Build of instance 50bbfdd5-bac5-4634-bc5d-c215a31889e9 was re-scheduled: A specified parameter was not correct: fileType [ 790.327940] env[59620]: Faults: ['InvalidArgument'] {{(pid=59620) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 790.327940] env[59620]: DEBUG 
nova.compute.manager [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] Unplugging VIFs for instance {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 790.328145] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Acquiring lock "refresh_cache-50bbfdd5-bac5-4634-bc5d-c215a31889e9" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 790.328183] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Acquired lock "refresh_cache-50bbfdd5-bac5-4634-bc5d-c215a31889e9" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 790.328432] env[59620]: DEBUG nova.network.neutron [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] Building network info cache for instance {{(pid=59620) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 790.515852] env[59620]: DEBUG nova.network.neutron [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] Instance cache missing network info. 
{{(pid=59620) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 790.693978] env[59620]: DEBUG nova.network.neutron [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] Updating instance_info_cache with network_info: [] {{(pid=59620) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 790.707358] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Releasing lock "refresh_cache-50bbfdd5-bac5-4634-bc5d-c215a31889e9" {{(pid=59620) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 790.707795] env[59620]: DEBUG nova.compute.manager [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59620) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 790.708041] env[59620]: DEBUG nova.compute.manager [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] [instance: 50bbfdd5-bac5-4634-bc5d-c215a31889e9] Skipping network deallocation for instance since networking was not requested. 
{{(pid=59620) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 790.837282] env[59620]: INFO nova.scheduler.client.report [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Deleted allocations for instance 50bbfdd5-bac5-4634-bc5d-c215a31889e9 [ 790.861656] env[59620]: DEBUG oslo_concurrency.lockutils [None req-e71d4e24-c7de-4057-bd26-bc9d8b3440c6 tempest-ServersAaction247Test-1199114441 tempest-ServersAaction247Test-1199114441-project-member] Lock "50bbfdd5-bac5-4634-bc5d-c215a31889e9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 53.975s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 798.311266] env[59620]: DEBUG oslo_service.periodic_task [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=59620) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 798.958592] env[59620]: DEBUG oslo_service.periodic_task [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59620) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 798.958734] env[59620]: DEBUG nova.compute.manager [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Starting heal instance info cache {{(pid=59620) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 798.958856] env[59620]: DEBUG nova.compute.manager [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Rebuilding the list of instances to heal {{(pid=59620) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 798.966639] env[59620]: DEBUG nova.compute.manager [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Didn't 
find any instances for network info cache update. {{(pid=59620) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 799.961982] env[59620]: DEBUG oslo_service.periodic_task [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59620) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 800.958867] env[59620]: DEBUG oslo_service.periodic_task [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59620) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 800.959120] env[59620]: DEBUG oslo_service.periodic_task [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59620) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 801.959955] env[59620]: DEBUG oslo_service.periodic_task [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59620) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 801.960378] env[59620]: DEBUG oslo_service.periodic_task [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59620) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 801.960378] env[59620]: DEBUG nova.compute.manager [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=59620) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 802.959755] env[59620]: DEBUG oslo_service.periodic_task [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59620) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 802.960077] env[59620]: DEBUG oslo_service.periodic_task [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59620) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 802.960372] env[59620]: DEBUG oslo_service.periodic_task [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Running periodic task ComputeManager.update_available_resource {{(pid=59620) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 802.969697] env[59620]: DEBUG oslo_concurrency.lockutils [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 802.969884] env[59620]: DEBUG oslo_concurrency.lockutils [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 802.970054] env[59620]: DEBUG oslo_concurrency.lockutils [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 802.970209] env[59620]: DEBUG 
nova.compute.resource_tracker [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59620) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 802.971279] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4eca79c8-b343-4d19-8aa4-ed959c3d4749 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 802.980359] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a9a2eb8-16a2-4fc1-bb12-6e40e259b432 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 802.993998] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c52a3d8-bb3f-41b3-9e62-7a2ce7525ac6 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 802.999966] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f52e0cd-4ef0-4421-a32c-0b255de4a7fc {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 803.027571] env[59620]: DEBUG nova.compute.resource_tracker [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181489MB free_disk=136GB free_vcpus=48 pci_devices=None {{(pid=59620) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 803.027706] env[59620]: DEBUG oslo_concurrency.lockutils [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59620) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 803.027875] env[59620]: DEBUG oslo_concurrency.lockutils [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 803.057136] env[59620]: DEBUG nova.compute.resource_tracker [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=59620) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 803.057303] env[59620]: DEBUG nova.compute.resource_tracker [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=59620) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 803.071021] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a2ac414-18a4-410c-a231-cebe1b202dbe {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 803.078140] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bfe80ee5-7a90-403b-8529-a8b8750399a2 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 803.106974] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db350467-e94d-4140-ac64-68540adc531c {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 803.113482] env[59620]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-3897343b-b836-4ed6-8f7d-68cbdc0758c6 {{(pid=59620) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 803.125923] env[59620]: DEBUG nova.compute.provider_tree [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Inventory has not changed in ProviderTree for provider: 40bba435-8384-412d-aa10-bdcf44760016 {{(pid=59620) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 803.133282] env[59620]: DEBUG nova.scheduler.client.report [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Inventory has not changed for provider 40bba435-8384-412d-aa10-bdcf44760016 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 136, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59620) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 803.147132] env[59620]: DEBUG nova.compute.resource_tracker [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59620) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 803.147294] env[59620]: DEBUG oslo_concurrency.lockutils [None req-fb20359e-82ae-4d53-9121-aa1b85f6ac5e None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.119s {{(pid=59620) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}