[ 568.112811] env[65869]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 568.753540] env[65918]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 570.280381] env[65918]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' {{(pid=65918) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 570.280607] env[65918]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' {{(pid=65918) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 570.280864] env[65918]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' {{(pid=65918) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 570.281183] env[65918]: INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
[ 570.282768] env[65918]: WARNING nova.servicegroup.api [-] Report interval must be less than service down time. Current config: . Setting service_down_time to: 300
[ 570.403558] env[65918]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm {{(pid=65918) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}}
[ 570.413429] env[65918]: DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.010s {{(pid=65918) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}}
[ 570.526057] env[65918]: INFO nova.virt.driver [None req-659a7fec-d430-4eac-a2fd-e53383b7c578 None None] Loading compute driver 'vmwareapi.VMwareVCDriver'
[ 570.611263] env[65918]: DEBUG oslo_concurrency.lockutils [-] Acquiring lock "oslo_vmware_api_lock" by "oslo_vmware.api.VMwareAPISession._create_session" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 570.611263] env[65918]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" acquired by "oslo_vmware.api.VMwareAPISession._create_session" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 570.611263] env[65918]: DEBUG oslo_vmware.service [-] Creating suds client with soap_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk' and wsdl_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk/vimService.wsdl' {{(pid=65918) __init__ /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:242}}
[ 573.915217] env[65918]: DEBUG oslo_vmware.service [-] Invoking ServiceInstance.RetrieveServiceContent with opID=oslo.vmware-1ca358e9-134c-4dde-bfd5-597d64dd144b {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 573.931031] env[65918]: DEBUG oslo_vmware.api [-] Logging into host: vc1.osci.c.eu-de-1.cloud.sap. {{(pid=65918) _create_session /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:242}}
[ 573.931178] env[65918]: DEBUG oslo_vmware.service [-] Invoking SessionManager.Login with opID=oslo.vmware-4fe8c2e0-c2e7-441c-8944-d468d5af01d4 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 573.955065] env[65918]: INFO oslo_vmware.api [-] Successfully established new session; session ID is eabb0.
[ 573.955184] env[65918]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" "released" by "oslo_vmware.api.VMwareAPISession._create_session" :: held 3.348s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 573.955827] env[65918]: INFO nova.virt.vmwareapi.driver [None req-659a7fec-d430-4eac-a2fd-e53383b7c578 None None] VMware vCenter version: 7.0.3
[ 573.959264] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b10d846e-d3f1-4013-b1a2-bf9312059681 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 573.976399] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-904b6a04-0c60-427e-8bac-a000c8433fda {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 573.982050] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3557eaa2-f9a3-4092-a663-10dad991da2c {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 573.988420] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e36db64-9d64-4577-b2a6-e696cb691703 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 574.002105] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5c698bc-705a-4284-b3f5-dd8fc90da5bc {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 574.007750] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-07373028-2305-48c0-ae32-a64140fce66b {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 574.037471] env[65918]: DEBUG oslo_vmware.service [-] Invoking ExtensionManager.FindExtension with opID=oslo.vmware-1bacddb6-d6bb-49e0-b624-6357cc9eaf04 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 574.042860] env[65918]: DEBUG nova.virt.vmwareapi.driver [None req-659a7fec-d430-4eac-a2fd-e53383b7c578 None None] Extension org.openstack.compute already exists. {{(pid=65918) _register_openstack_extension /opt/stack/nova/nova/virt/vmwareapi/driver.py:214}}
[ 574.045594] env[65918]: INFO nova.compute.provider_config [None req-659a7fec-d430-4eac-a2fd-e53383b7c578 None None] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
[ 574.062468] env[65918]: DEBUG nova.context [None req-659a7fec-d430-4eac-a2fd-e53383b7c578 None None] Found 2 cells: 00000000-0000-0000-0000-000000000000(cell0),d5b0d2e9-df60-4e9f-a2f8-a90735ec843d(cell1) {{(pid=65918) load_cells /opt/stack/nova/nova/context.py:464}}
[ 574.064389] env[65918]: DEBUG oslo_concurrency.lockutils [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 574.064614] env[65918]: DEBUG oslo_concurrency.lockutils [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 574.065346] env[65918]: DEBUG oslo_concurrency.lockutils [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 574.065694] env[65918]: DEBUG oslo_concurrency.lockutils [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] Acquiring lock "d5b0d2e9-df60-4e9f-a2f8-a90735ec843d" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 574.065880] env[65918]: DEBUG oslo_concurrency.lockutils [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] Lock "d5b0d2e9-df60-4e9f-a2f8-a90735ec843d" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 574.066843] env[65918]: DEBUG oslo_concurrency.lockutils [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] Lock "d5b0d2e9-df60-4e9f-a2f8-a90735ec843d" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 574.079247] env[65918]: DEBUG oslo_db.sqlalchemy.engines [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=65918) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}}
[ 574.079645] env[65918]: DEBUG oslo_db.sqlalchemy.engines [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=65918) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}}
[ 574.088097] env[65918]: ERROR nova.db.main.api [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] No DB access allowed in nova-compute: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 574.088097] env[65918]: result = function(*args, **kwargs)
[ 574.088097] env[65918]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 574.088097] env[65918]: return func(*args, **kwargs)
[ 574.088097] env[65918]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 574.088097] env[65918]: result = fn(*args, **kwargs)
[ 574.088097] env[65918]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 574.088097] env[65918]: return f(*args, **kwargs)
[ 574.088097] env[65918]: File "/opt/stack/nova/nova/objects/service.py", line 546, in _db_service_get_minimum_version
[ 574.088097] env[65918]: return db.service_get_minimum_version(context, binaries)
[ 574.088097] env[65918]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 574.088097] env[65918]: _check_db_access()
[ 574.088097] env[65918]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 574.088097] env[65918]: stacktrace = ''.join(traceback.format_stack())
[ 574.088097] env[65918]:
[ 574.089071] env[65918]: ERROR nova.db.main.api [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] No DB access allowed in nova-compute: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 574.089071] env[65918]: result = function(*args, **kwargs)
[ 574.089071] env[65918]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 574.089071] env[65918]: return func(*args, **kwargs)
[ 574.089071] env[65918]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 574.089071] env[65918]: result = fn(*args, **kwargs)
[ 574.089071] env[65918]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 574.089071] env[65918]: return f(*args, **kwargs)
[ 574.089071] env[65918]: File "/opt/stack/nova/nova/objects/service.py", line 546, in _db_service_get_minimum_version
[ 574.089071] env[65918]: return db.service_get_minimum_version(context, binaries)
[ 574.089071] env[65918]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 574.089071] env[65918]: _check_db_access()
[ 574.089071] env[65918]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 574.089071] env[65918]: stacktrace = ''.join(traceback.format_stack())
[ 574.089071] env[65918]:
[ 574.089490] env[65918]: WARNING nova.objects.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] Failed to get minimum service version for cell 00000000-0000-0000-0000-000000000000
[ 574.089651] env[65918]: WARNING nova.objects.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] Failed to get minimum service version for cell d5b0d2e9-df60-4e9f-a2f8-a90735ec843d
[ 574.090118] env[65918]: DEBUG oslo_concurrency.lockutils [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] Acquiring lock "singleton_lock" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 574.090330] env[65918]: DEBUG oslo_concurrency.lockutils [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] Acquired lock "singleton_lock" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 574.090606] env[65918]: DEBUG oslo_concurrency.lockutils [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] Releasing lock "singleton_lock" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 574.090994] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] Full set of CONF: {{(pid=65918) _wait_for_exit_or_signal
/usr/local/lib/python3.10/dist-packages/oslo_service/service.py:362}} [ 574.091187] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] ******************************************************************************** {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2589}} [ 574.091346] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] Configuration options gathered from: {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2590}} [ 574.091518] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] command line args: ['--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-cpu-common.conf', '--config-file', '/etc/nova/nova-cpu-1.conf'] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2591}} [ 574.091774] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2592}} [ 574.091913] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] ================================================================================ {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2594}} [ 574.092190] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] allow_resize_to_same_host = True {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.092417] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] arq_binding_timeout = 300 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.092581] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] backdoor_port = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.092761] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] backdoor_socket = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.092968] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] block_device_allocate_retries = 60 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.093192] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] block_device_allocate_retries_interval = 3 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.093399] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cert = self.pem {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.093606] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] compute_driver = vmwareapi.VMwareVCDriver {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.093835] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] 
compute_monitors = [] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.094094] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] config_dir = [] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.094325] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] config_drive_format = iso9660 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.094506] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.094694] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] config_source = [] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.094883] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] console_host = devstack {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.095125] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] control_exchange = nova {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.095334] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cpu_allocation_ratio = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.095512] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] daemon = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.095700] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] debug = True {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.095879] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] default_access_ip_network_name = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.096074] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] default_availability_zone = nova {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.096319] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] default_ephemeral_format = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.096610] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 
'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.096798] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] default_schedule_zone = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.096996] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] disk_allocation_ratio = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.097193] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] enable_new_services = True {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.097389] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] enabled_apis = ['osapi_compute'] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.097588] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] enabled_ssl_apis = [] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.097815] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] flat_injected = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.097994] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] force_config_drive = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.098189] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] force_raw_images = True {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.098395] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] graceful_shutdown_timeout = 5 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.098612] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] heal_instance_info_cache_interval = 60 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.098842] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] host = cpu-1 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.099026] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] initial_cpu_allocation_ratio = 4.0 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.099264] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] initial_disk_allocation_ratio = 1.0 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.099487] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] initial_ram_allocation_ratio = 1.0 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.099722] env[65918]: DEBUG oslo_service.service [None 
req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.099939] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] instance_build_timeout = 0 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.100215] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] instance_delete_interval = 300 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.100443] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] instance_format = [instance: %(uuid)s] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.100697] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] instance_name_template = instance-%08x {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.100940] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] instance_usage_audit = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.101174] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] instance_usage_audit_period = month {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.101377] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.101561] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] instances_path = /opt/stack/data/nova/instances {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.101773] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] internal_service_availability_zone = internal {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.102038] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] key = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.102246] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] live_migration_retry_count = 30 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.102514] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] log_config_append = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.102742] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.102927] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] log_dir = None {{(pid=65918) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.103127] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] log_file = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.103275] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] log_options = True {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.103455] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] log_rotate_interval = 1 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.103640] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] log_rotate_interval_type = days {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.103829] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] log_rotation_type = none {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.104032] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.104186] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.104425] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.104667] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.104822] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.105010] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] long_rpc_timeout = 1800 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.105214] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] max_concurrent_builds = 10 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.105450] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] max_concurrent_live_migrations = 1 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.105641] env[65918]: DEBUG oslo_service.service [None 
req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] max_concurrent_snapshots = 5 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.105821] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] max_local_block_devices = 3 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.106021] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] max_logfile_count = 30 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.106193] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] max_logfile_size_mb = 200 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.106420] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] maximum_instance_delete_attempts = 5 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.106620] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] metadata_listen = 0.0.0.0 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.106815] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] metadata_listen_port = 8775 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.106999] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] metadata_workers = 2 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.107190] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] migrate_max_retries = -1 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.107403] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] mkisofs_cmd = genisoimage {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.107642] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] my_block_storage_ip = 10.180.1.21 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.107845] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] my_ip = 10.180.1.21 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.108052] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] network_allocate_retries = 0 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.108256] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.108484] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] osapi_compute_listen = 0.0.0.0 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.108699] env[65918]: DEBUG oslo_service.service [None 
req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] osapi_compute_listen_port = 8774 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.108933] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] osapi_compute_unique_server_name_scope = {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.109154] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] osapi_compute_workers = 2 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.109342] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] password_length = 12 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.109583] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] periodic_enable = True {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.109803] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] periodic_fuzzy_delay = 60 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.110010] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] pointer_model = usbtablet {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.110244] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] preallocate_images = none {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.110489] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] publish_errors = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.110656] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] pybasedir = /opt/stack/nova {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.110834] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] ram_allocation_ratio = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.111046] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] rate_limit_burst = 0 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.111259] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] rate_limit_except_level = CRITICAL {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.111465] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] rate_limit_interval = 0 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.111686] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] reboot_timeout = 0 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.111868] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] 
reclaim_instance_interval = 0 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.112118] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] record = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.112328] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] reimage_timeout_per_gb = 60 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.112514] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] report_interval = 120 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.112680] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] rescue_timeout = 0 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.112856] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] reserved_host_cpus = 0 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.113057] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] reserved_host_disk_mb = 0 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.113229] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] reserved_host_memory_mb = 512 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.113396] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] reserved_huge_pages = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.113557] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] resize_confirm_window = 0 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.113718] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] resize_fs_using_block_device = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.113876] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] resume_guests_state_on_host_boot = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.114071] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.114258] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] rpc_response_timeout = 60 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.114443] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] run_external_periodic_tasks = True {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.114618] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] running_deleted_instance_action = reap 
{{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.114842] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] running_deleted_instance_poll_interval = 1800 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.115081] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] running_deleted_instance_timeout = 0 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.115309] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] scheduler_instance_sync_interval = 120 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.115460] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] service_down_time = 300 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.115645] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] servicegroup_driver = db {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.115827] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] shelved_offload_time = 0 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.116031] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] shelved_poll_interval = 3600 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.116265] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] shutdown_timeout = 0 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.116486] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] source_is_ipv6 = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.116727] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] ssl_only = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.117050] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] state_path = /opt/stack/data/n-cpu-1 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.117242] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] sync_power_state_interval = 600 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.117430] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] sync_power_state_pool_size = 1000 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.117677] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] syslog_log_facility = LOG_USER {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.117893] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] tempdir = None {{(pid=65918) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.118136] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] timeout_nbd = 10 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.118373] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] transport_url = **** {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.118553] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] update_resources_interval = 0 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.118762] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] use_cow_images = True {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.118947] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] use_eventlog = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.119170] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] use_journal = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.119367] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] use_json = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.119534] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] use_rootwrap_daemon = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.119697] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] use_stderr = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.119858] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] use_syslog = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.120027] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vcpu_pin_set = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.120244] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vif_plugging_is_fatal = True {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.120442] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vif_plugging_timeout = 300 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.120616] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] virt_mkfs = [] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.120779] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] volume_usage_poll_interval = 0 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.120942] env[65918]: DEBUG oslo_service.service [None 
req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] watch_log_file = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.121217] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] web = /usr/share/spice-html5 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 574.121418] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_concurrency.disable_process_locking = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.121759] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_concurrency.lock_path = /opt/stack/data/n-cpu-1 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.121950] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.122134] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_messaging_metrics.metrics_enabled = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.122311] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_messaging_metrics.metrics_process_name = {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.122485] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.122653] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.122835] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] api.auth_strategy = keystone {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.123022] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] api.compute_link_prefix = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.123194] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.123371] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] api.dhcp_domain = novalocal {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.123542] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] api.enable_instance_password = True {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.123707] env[65918]: DEBUG oslo_service.service [None 
req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] api.glance_link_prefix = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.123870] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.124049] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] api.instance_list_cells_batch_strategy = distributed {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.124218] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] api.instance_list_per_project_cells = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.124383] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] api.list_records_by_skipping_down_cells = True {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.124547] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] api.local_metadata_per_cell = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.124714] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] api.max_limit = 1000 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.124880] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] api.metadata_cache_expiration = 15 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.125065] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] api.neutron_default_tenant_id = default {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.125239] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] api.use_forwarded_for = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.125407] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] api.use_neutron_default_nets = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.125574] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] api.vendordata_dynamic_connect_timeout = 5 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.125737] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] api.vendordata_dynamic_failure_fatal = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.125902] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] api.vendordata_dynamic_read_timeout = 5 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.126087] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] api.vendordata_dynamic_ssl_certfile = {{(pid=65918) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.126263] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] api.vendordata_dynamic_targets = [] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.126432] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] api.vendordata_jsonfile_path = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.126612] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] api.vendordata_providers = ['StaticJSON'] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.126803] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cache.backend = dogpile.cache.memcached {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.126971] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cache.backend_argument = **** {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.127160] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cache.config_prefix = cache.oslo {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.127333] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cache.dead_timeout = 60.0 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.127497] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cache.debug_cache_backend = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.127659] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cache.enable_retry_client = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.127822] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cache.enable_socket_keepalive = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.127991] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cache.enabled = True {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.128167] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cache.expiration_time = 600 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.128333] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cache.hashclient_retry_attempts = 2 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.128497] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cache.hashclient_retry_delay = 1.0 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.128660] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] 
cache.memcache_dead_retry = 300 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.128827] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cache.memcache_password = {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.128992] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cache.memcache_pool_connection_get_timeout = 10 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.129188] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cache.memcache_pool_flush_on_reconnect = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.129369] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cache.memcache_pool_maxsize = 10 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.129535] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cache.memcache_pool_unused_timeout = 60 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.129697] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cache.memcache_sasl_enabled = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.129874] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cache.memcache_servers = ['localhost:11211'] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.130052] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cache.memcache_socket_timeout = 1.0 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.130265] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cache.memcache_username = {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.130446] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cache.proxies = [] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.130611] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cache.retry_attempts = 2 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.130775] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cache.retry_delay = 0.0 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.130939] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cache.socket_keepalive_count = 1 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.131114] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cache.socket_keepalive_idle = 1 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.131280] env[65918]: DEBUG oslo_service.service [None 
req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cache.socket_keepalive_interval = 1 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.131446] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cache.tls_allowed_ciphers = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.131621] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cache.tls_cafile = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.131781] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cache.tls_certfile = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.131942] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cache.tls_enabled = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.132112] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cache.tls_keyfile = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.132318] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cinder.auth_section = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.132508] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cinder.auth_type = password {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.132673] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cinder.cafile = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.132854] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cinder.catalog_info = volumev3::publicURL {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.133026] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cinder.certfile = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.133200] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cinder.collect_timing = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.133365] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cinder.cross_az_attach = True {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.133529] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cinder.debug = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.133691] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cinder.endpoint_template = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.133855] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 
None None] cinder.http_retries = 3 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.134027] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cinder.insecure = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.134193] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cinder.keyfile = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.134366] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cinder.os_region_name = RegionOne {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.134530] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cinder.split_loggers = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.134690] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cinder.timeout = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.134862] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] compute.consecutive_build_service_disable_threshold = 10 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.135032] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] compute.cpu_dedicated_set = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.135197] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] compute.cpu_shared_set = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.135365] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] compute.image_type_exclude_list = [] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.135529] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] compute.live_migration_wait_for_vif_plug = True {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.135693] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] compute.max_concurrent_disk_ops = 0 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.135856] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] compute.max_disk_devices_to_attach = -1 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.136029] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] compute.packing_host_numa_cells_allocation_strategy = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.136204] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 
574.136368] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] compute.resource_provider_association_refresh = 300 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.136529] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] compute.shutdown_retry_interval = 10 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.136707] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.136885] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] conductor.workers = 2 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.137072] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] console.allowed_origins = [] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.137236] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] console.ssl_ciphers = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.137407] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] console.ssl_minimum_version = default {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.137578] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] consoleauth.token_ttl = 600 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.137746] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cyborg.cafile = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.137904] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cyborg.certfile = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.138080] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cyborg.collect_timing = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.138242] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cyborg.connect_retries = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.138403] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cyborg.connect_retry_delay = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.138562] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cyborg.endpoint_override = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.138725] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cyborg.insecure = False {{(pid=65918) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.138881] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cyborg.keyfile = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.139049] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cyborg.max_version = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.139243] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cyborg.min_version = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.139421] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cyborg.region_name = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.139581] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cyborg.service_name = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.139751] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cyborg.service_type = accelerator {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.139913] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cyborg.split_loggers = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.140093] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cyborg.status_code_retries = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.140279] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cyborg.status_code_retry_delay = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.140444] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cyborg.timeout = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.140625] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.140787] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] cyborg.version = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.140973] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] database.backend = sqlalchemy {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.141166] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] database.connection = **** {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.141343] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] database.connection_debug = 0 {{(pid=65918) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.141553] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] database.connection_parameters = {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.141786] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] database.connection_recycle_time = 3600 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.141976] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] database.connection_trace = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.142161] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] database.db_inc_retry_interval = True {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.142334] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] database.db_max_retries = 20 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.142501] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] database.db_max_retry_interval = 10 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.142670] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] database.db_retry_interval = 1 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.142838] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] database.max_overflow = 50 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.143017] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] database.max_pool_size = 5 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.143215] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] database.max_retries = 10 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.143386] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] database.mysql_enable_ndb = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.143557] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] database.mysql_sql_mode = TRADITIONAL {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.143720] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] database.mysql_wsrep_sync_wait = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.143884] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] database.pool_timeout = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.144067] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] database.retry_interval = 10 
{{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.144234] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] database.slave_connection = **** {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.144447] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] database.sqlite_synchronous = True {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.144623] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] database.use_db_reconnect = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.144866] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] api_database.backend = sqlalchemy {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.145074] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] api_database.connection = **** {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.145254] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] api_database.connection_debug = 0 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.145431] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] api_database.connection_parameters = {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.145594] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] api_database.connection_recycle_time = 3600 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.145761] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] api_database.connection_trace = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.145924] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] api_database.db_inc_retry_interval = True {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.146099] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] api_database.db_max_retries = 20 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.146267] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] api_database.db_max_retry_interval = 10 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.146432] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] api_database.db_retry_interval = 1 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.146600] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] api_database.max_overflow = 50 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.146765] env[65918]: DEBUG oslo_service.service [None 
req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] api_database.max_pool_size = 5 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.146936] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] api_database.max_retries = 10 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.147115] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] api_database.mysql_enable_ndb = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.147289] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.147449] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] api_database.mysql_wsrep_sync_wait = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.147636] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] api_database.pool_timeout = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.147855] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] api_database.retry_interval = 10 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.148036] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] api_database.slave_connection = **** {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.148233] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] api_database.sqlite_synchronous = True {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.148430] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] devices.enabled_mdev_types = [] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.148637] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.148824] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] ephemeral_storage_encryption.enabled = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.148990] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] ephemeral_storage_encryption.key_size = 512 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.149199] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] glance.api_servers = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.149410] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] glance.cafile = None {{(pid=65918) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.149608] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] glance.certfile = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.151371] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] glance.collect_timing = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.151567] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] glance.connect_retries = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.151743] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] glance.connect_retry_delay = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.151920] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] glance.debug = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.152113] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] glance.default_trusted_certificate_ids = [] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.152291] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] glance.enable_certificate_validation = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.152462] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] glance.enable_rbd_download = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.152630] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] glance.endpoint_override = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.152802] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] glance.insecure = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.152971] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] glance.keyfile = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.153152] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] glance.max_version = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.153318] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] glance.min_version = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.153491] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] glance.num_retries = 3 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.153664] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] glance.rbd_ceph_conf = {{(pid=65918) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.153835] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] glance.rbd_connect_timeout = 5 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.154016] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] glance.rbd_pool = {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.154197] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] glance.rbd_user = {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.154365] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] glance.region_name = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.154530] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] glance.service_name = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.154703] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] glance.service_type = image {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.154870] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] glance.split_loggers = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.155042] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] glance.status_code_retries = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.155211] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] glance.status_code_retry_delay = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.155379] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] glance.timeout = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.155567] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.155737] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] glance.verify_glance_signatures = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.155902] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] glance.version = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.156087] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] guestfs.debug = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.156270] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] hyperv.config_drive_cdrom = False {{(pid=65918) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.156471] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] hyperv.config_drive_inject_password = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.156655] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.156826] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] hyperv.enable_instance_metrics_collection = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.156993] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] hyperv.enable_remotefx = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.157209] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] hyperv.instances_path_share = {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.157457] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] hyperv.iscsi_initiator_list = [] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.157525] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] hyperv.limit_cpu_features = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.157700] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.157865] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.158048] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] hyperv.power_state_check_timeframe = 60 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.159127] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] hyperv.power_state_event_polling_interval = 2 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.159127] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] hyperv.qemu_img_cmd = qemu-img.exe {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.159127] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] hyperv.use_multipath_io = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.159127] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] hyperv.volume_attach_retry_count = 10 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.159127] env[65918]: DEBUG oslo_service.service [None 
req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] hyperv.volume_attach_retry_interval = 5 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.159127] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] hyperv.vswitch_name = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.159449] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.159449] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] mks.enabled = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.159823] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.160023] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] image_cache.manager_interval = 2400 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.160221] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] image_cache.precache_concurrency = 1 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.160408] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] image_cache.remove_unused_base_images = True {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.160581] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.160750] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.160930] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] image_cache.subdirectory_name = _base {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.161128] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] ironic.api_max_retries = 60 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.161300] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] ironic.api_retry_interval = 2 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.161481] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] ironic.auth_section = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.161664] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] ironic.auth_type = None {{(pid=65918) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.161830] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] ironic.cafile = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.161991] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] ironic.certfile = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.162176] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] ironic.collect_timing = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.162343] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] ironic.connect_retries = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.162504] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] ironic.connect_retry_delay = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.162661] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] ironic.endpoint_override = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.162824] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] ironic.insecure = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.162983] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] ironic.keyfile = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.163157] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] ironic.max_version = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.163320] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] ironic.min_version = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.163480] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] ironic.partition_key = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.163648] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] ironic.peer_list = [] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.163805] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] ironic.region_name = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.163971] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] ironic.serial_console_state_timeout = 10 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.164144] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] ironic.service_name = None {{(pid=65918) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.164316] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] ironic.service_type = baremetal {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.164478] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] ironic.split_loggers = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.164638] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] ironic.status_code_retries = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.164796] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] ironic.status_code_retry_delay = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.164954] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] ironic.timeout = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.165148] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] ironic.valid_interfaces = ['internal', 'public'] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.165340] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] ironic.version = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.165501] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.165676] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] key_manager.fixed_key = **** {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.165858] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] barbican.auth_endpoint = http://localhost/identity/v3 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.166029] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] barbican.barbican_api_version = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.166195] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] barbican.barbican_endpoint = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.166371] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] barbican.barbican_endpoint_type = public {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.166533] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] barbican.barbican_region_name = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.166695] env[65918]: DEBUG oslo_service.service [None 
req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] barbican.cafile = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.166856] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] barbican.certfile = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.167028] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] barbican.collect_timing = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.167200] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] barbican.insecure = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.167361] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] barbican.keyfile = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.167526] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] barbican.number_of_retries = 60 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.167692] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] barbican.retry_delay = 1 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.167858] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] barbican.send_service_user_token = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.168030] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] barbican.split_loggers = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.168198] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] barbican.timeout = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.168366] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] barbican.verify_ssl = True {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.168565] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] barbican.verify_ssl_path = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.168745] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] barbican_service_user.auth_section = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.168910] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] barbican_service_user.auth_type = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.169084] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] barbican_service_user.cafile = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.169280] env[65918]: DEBUG oslo_service.service [None 
req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] barbican_service_user.certfile = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.169457] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] barbican_service_user.collect_timing = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.169623] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] barbican_service_user.insecure = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.169785] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] barbican_service_user.keyfile = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.169951] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] barbican_service_user.split_loggers = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.170128] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] barbican_service_user.timeout = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.170303] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vault.approle_role_id = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.170499] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vault.approle_secret_id = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.170618] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vault.cafile = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.170776] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vault.certfile = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.170938] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vault.collect_timing = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.171113] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vault.insecure = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.171278] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vault.keyfile = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.171462] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vault.kv_mountpoint = secret {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.171643] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vault.kv_version = 2 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.171808] env[65918]: DEBUG oslo_service.service [None 
req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vault.namespace = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.171971] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vault.root_token_id = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.172153] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vault.split_loggers = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.172318] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vault.ssl_ca_crt_file = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.172479] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vault.timeout = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.172645] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vault.use_ssl = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.172817] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.172985] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] keystone.cafile = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.173162] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] keystone.certfile = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.173331] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] keystone.collect_timing = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.173490] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] keystone.connect_retries = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.173650] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] keystone.connect_retry_delay = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.173809] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] keystone.endpoint_override = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.173973] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] keystone.insecure = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.174142] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] keystone.keyfile = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.174301] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 
None None] keystone.max_version = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.174458] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] keystone.min_version = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.174613] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] keystone.region_name = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.174770] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] keystone.service_name = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.174938] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] keystone.service_type = identity {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.175112] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] keystone.split_loggers = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.175273] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] keystone.status_code_retries = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.175430] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] keystone.status_code_retry_delay = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.175587] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] keystone.timeout = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.175766] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.175924] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] keystone.version = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.176136] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.connection_uri = {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.176302] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.cpu_mode = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.176528] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.cpu_model_extra_flags = [] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.176715] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.cpu_models = [] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.176890] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None 
None] libvirt.cpu_power_governor_high = performance {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.177073] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.cpu_power_governor_low = powersave {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.177248] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.cpu_power_management = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.177424] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.177589] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.device_detach_attempts = 8 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.177751] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.device_detach_timeout = 20 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.177916] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.disk_cachemodes = [] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.178092] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.disk_prefix = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.178260] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.enabled_perf_events = [] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.178424] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.file_backed_memory = 0 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.178589] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.gid_maps = [] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.178747] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.hw_disk_discard = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.178907] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.hw_machine_type = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.179089] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.images_rbd_ceph_conf = {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.179297] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.179478] env[65918]: DEBUG 
oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.179649] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.images_rbd_glance_store_name = {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.179818] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.images_rbd_pool = rbd {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.179987] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.images_type = default {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.180187] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.images_volume_group = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.180370] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.inject_key = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.180536] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.inject_partition = -2 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.180703] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.inject_password = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.180865] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.iscsi_iface = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.181038] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.iser_use_multipath = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.181208] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.live_migration_bandwidth = 0 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.181371] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.live_migration_completion_timeout = 800 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.181537] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.live_migration_downtime = 500 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.181699] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.live_migration_downtime_delay = 75 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.181861] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.live_migration_downtime_steps = 10 {{(pid=65918) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.182027] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.live_migration_inbound_addr = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.182195] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.live_migration_permit_auto_converge = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.182360] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.live_migration_permit_post_copy = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.182530] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.live_migration_scheme = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.182709] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.live_migration_timeout_action = abort {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.182876] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.live_migration_tunnelled = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.183044] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.live_migration_uri = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.183214] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.live_migration_with_native_tls = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.183375] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.max_queues = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.183539] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.mem_stats_period_seconds = 10 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.183699] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.nfs_mount_options = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.184012] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.nfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.184194] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.num_aoe_discover_tries = 3 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.184362] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.num_iser_scan_tries = 5 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.184523] env[65918]: DEBUG 
oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.num_memory_encrypted_guests = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.184688] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.num_nvme_discover_tries = 5 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.184852] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.num_pcie_ports = 0 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.185029] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.num_volume_scan_tries = 5 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.185195] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.pmem_namespaces = [] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.185357] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.quobyte_client_cfg = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.185642] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.quobyte_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.185815] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.rbd_connect_timeout = 5 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.185980] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.186160] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.186324] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.rbd_secret_uuid = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.186485] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.rbd_user = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.186648] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.realtime_scheduler_priority = 1 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.186819] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.remote_filesystem_transport = ssh {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.186978] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.rescue_image_id = None {{(pid=65918) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.187152] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.rescue_kernel_id = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.187311] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.rescue_ramdisk_id = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.187482] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.rng_dev_path = /dev/urandom {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.187641] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.rx_queue_size = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.187810] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.smbfs_mount_options = {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.188099] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.smbfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.188278] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.snapshot_compression = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.188448] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.snapshot_image_format = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.188669] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.188840] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.sparse_logical_volumes = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.189022] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.swtpm_enabled = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.189222] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.swtpm_group = tss {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.189397] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.swtpm_user = tss {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.189572] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.sysinfo_serial = unique {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.189737] env[65918]: DEBUG oslo_service.service [None 
req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.tx_queue_size = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.189905] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.uid_maps = [] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.190088] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.use_virtio_for_bridges = True {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.190288] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.virt_type = kvm {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.190466] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.volume_clear = zero {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.190633] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.volume_clear_size = 0 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.190804] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.volume_use_multipath = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.190963] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.vzstorage_cache_path = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.191145] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.191321] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.vzstorage_mount_group = qemu {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.191496] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.vzstorage_mount_opts = [] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.191660] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.vzstorage_mount_perms = 0770 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.191940] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.192138] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.vzstorage_mount_user = stack {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.192329] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=65918) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.192507] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] neutron.auth_section = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.192684] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] neutron.auth_type = password {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.192846] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] neutron.cafile = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.193024] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] neutron.certfile = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.193189] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] neutron.collect_timing = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.193350] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] neutron.connect_retries = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.193519] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] neutron.connect_retry_delay = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.193682] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] neutron.default_floating_pool = public {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.193842] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] neutron.endpoint_override = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.194013] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] neutron.extension_sync_interval = 600 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.194185] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] neutron.http_retries = 3 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.194349] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] neutron.insecure = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.194509] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] neutron.keyfile = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.194669] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] neutron.max_version = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.194838] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] neutron.metadata_proxy_shared_secret = **** {{(pid=65918) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.194997] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] neutron.min_version = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.195182] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] neutron.ovs_bridge = br-int {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.195354] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] neutron.physnets = [] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.195526] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] neutron.region_name = RegionOne {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.195697] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] neutron.service_metadata_proxy = True {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.195859] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] neutron.service_name = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.196040] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] neutron.service_type = network {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.196213] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] neutron.split_loggers = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.196376] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] neutron.status_code_retries = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.196534] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] neutron.status_code_retry_delay = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.196693] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] neutron.timeout = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.196875] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.197047] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] neutron.version = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.197223] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] notifications.bdms_in_notifications = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.197404] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] notifications.default_level = INFO 
{{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.197581] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] notifications.notification_format = unversioned {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.197747] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] notifications.notify_on_state_change = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.197923] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.198112] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] pci.alias = [] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.198289] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] pci.device_spec = [] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.198458] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] pci.report_in_placement = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.198631] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] placement.auth_section = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.198804] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] placement.auth_type = password {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.198973] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] placement.auth_url = http://10.180.1.21/identity {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.199168] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] placement.cafile = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.199333] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] placement.certfile = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.199501] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] placement.collect_timing = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.199660] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] placement.connect_retries = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.199818] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] placement.connect_retry_delay = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.199973] env[65918]: DEBUG oslo_service.service [None 
req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] placement.default_domain_id = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.200162] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] placement.default_domain_name = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.200338] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] placement.domain_id = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.200498] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] placement.domain_name = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.200658] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] placement.endpoint_override = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.200829] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] placement.insecure = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.200978] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] placement.keyfile = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.201152] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] placement.max_version = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.201314] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] placement.min_version = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.201481] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] placement.password = **** {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.201641] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] placement.project_domain_id = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.201811] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] placement.project_domain_name = Default {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.201979] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] placement.project_id = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.202168] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] placement.project_name = service {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.202341] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] placement.region_name = RegionOne {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.202502] env[65918]: DEBUG oslo_service.service 
[None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] placement.service_name = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.202671] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] placement.service_type = placement {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.202837] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] placement.split_loggers = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.202997] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] placement.status_code_retries = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.203171] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] placement.status_code_retry_delay = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.203333] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] placement.system_scope = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.203491] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] placement.timeout = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.203649] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] placement.trust_id = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.203805] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] placement.user_domain_id = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.203972] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] placement.user_domain_name = Default {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.204148] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] placement.user_id = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.204350] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] placement.username = placement {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.204539] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] placement.valid_interfaces = ['internal', 'public'] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.204702] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] placement.version = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.204881] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] quota.cores = 20 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.205059] env[65918]: DEBUG 
oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] quota.count_usage_from_placement = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.205237] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.205416] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] quota.injected_file_content_bytes = 10240 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.205585] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] quota.injected_file_path_length = 255 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.205753] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] quota.injected_files = 5 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.205919] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] quota.instances = 10 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.206097] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] quota.key_pairs = 100 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.206266] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] quota.metadata_items = 128 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.206435] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] quota.ram = 51200 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.206600] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] quota.recheck_quota = True {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.206767] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] quota.server_group_members = 10 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.206932] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] quota.server_groups = 10 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.207117] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] rdp.enabled = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.207437] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.207628] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] scheduler.discover_hosts_in_cells_interval = -1 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.207801] 
env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.207969] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] scheduler.image_metadata_prefilter = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.208148] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.208321] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] scheduler.max_attempts = 3 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.208486] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] scheduler.max_placement_results = 1000 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.208653] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.208819] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] scheduler.query_placement_for_availability_zone = True {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.208983] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] scheduler.query_placement_for_image_type_support = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.209177] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.209366] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] scheduler.workers = 2 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.209543] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.209716] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] filter_scheduler.aggregate_image_properties_isolation_separator = . 
{{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.209896] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.210080] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.210287] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.210463] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.210629] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.210822] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.210993] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] filter_scheduler.host_subset_size = 1 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.211170] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] filter_scheduler.image_properties_default_architecture = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.211338] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.211528] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] filter_scheduler.isolated_hosts = [] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.211707] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] filter_scheduler.isolated_images = [] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.211874] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] filter_scheduler.max_instances_per_host = 50 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.212050] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=65918) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.212217] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] filter_scheduler.pci_in_placement = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.212385] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.212551] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.212720] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.212888] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.213067] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.213242] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.213413] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] filter_scheduler.track_instance_changes = True {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.213597] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.213773] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] metrics.required = True {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.213941] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] metrics.weight_multiplier = 1.0 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.214121] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] metrics.weight_of_unavailable = -10000.0 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.214291] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] metrics.weight_setting = [] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.214599] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=65918) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.214778] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] serial_console.enabled = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.214957] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] serial_console.port_range = 10000:20000 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.215144] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.215321] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.215494] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] serial_console.serialproxy_port = 6083 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.215662] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] service_user.auth_section = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.215835] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] service_user.auth_type = password {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.215998] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] service_user.cafile = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.216174] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] service_user.certfile = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.216371] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] service_user.collect_timing = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.216540] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] service_user.insecure = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.216705] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] service_user.keyfile = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.216878] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] service_user.send_service_user_token = True {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.217056] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] service_user.split_loggers = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.217230] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None 
None] service_user.timeout = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.217402] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] spice.agent_enabled = True {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.217582] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] spice.enabled = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.217882] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.218089] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] spice.html5proxy_host = 0.0.0.0 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.218273] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] spice.html5proxy_port = 6082 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.218439] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] spice.image_compression = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.218601] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] spice.jpeg_compression = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.218763] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] spice.playback_compression = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.218934] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] spice.server_listen = 127.0.0.1 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.219141] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.219316] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] spice.streaming_mode = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.219485] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] spice.zlib_compression = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.219651] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] upgrade_levels.baseapi = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.219813] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] upgrade_levels.cert = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.219987] env[65918]: DEBUG oslo_service.service [None 
req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] upgrade_levels.compute = auto {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.220186] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] upgrade_levels.conductor = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.220370] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] upgrade_levels.scheduler = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.220545] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vendordata_dynamic_auth.auth_section = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.220714] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vendordata_dynamic_auth.auth_type = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.220877] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vendordata_dynamic_auth.cafile = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.221050] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vendordata_dynamic_auth.certfile = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.221223] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vendordata_dynamic_auth.collect_timing = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.221389] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vendordata_dynamic_auth.insecure = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.221553] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vendordata_dynamic_auth.keyfile = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.221717] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vendordata_dynamic_auth.split_loggers = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.221878] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vendordata_dynamic_auth.timeout = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.222065] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vmware.api_retry_count = 10 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.222233] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vmware.ca_file = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.222409] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vmware.cache_prefix = devstack-image-cache {{(pid=65918) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.222578] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vmware.cluster_name = testcl1 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.222745] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vmware.connection_pool_size = 10 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.222906] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vmware.console_delay_seconds = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.223087] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vmware.datastore_regex = ^datastore.* {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.223296] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vmware.host_ip = vc1.osci.c.eu-de-1.cloud.sap {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.223476] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vmware.host_password = **** {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.223644] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vmware.host_port = 443 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.223812] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vmware.host_username = administrator@vsphere.local {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.223981] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vmware.insecure = True {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.224158] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vmware.integration_bridge = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.224329] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vmware.maximum_objects = 100 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.224493] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vmware.pbm_default_policy = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.224658] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vmware.pbm_enabled = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.224819] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vmware.pbm_wsdl_location = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.224991] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vmware.serial_log_dir = 
/opt/vmware/vspc {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.225169] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vmware.serial_port_proxy_uri = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.225333] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vmware.serial_port_service_uri = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.225502] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vmware.task_poll_interval = 0.5 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.225675] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vmware.use_linked_clone = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.225847] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vmware.vnc_keymap = en-us {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.226026] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vmware.vnc_port = 5900 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.226199] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vmware.vnc_port_total = 10000 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.226389] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vnc.auth_schemes = ['none'] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.226566] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vnc.enabled = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.226851] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vnc.novncproxy_base_url = http://127.0.0.1:6080/vnc_auto.html {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.227044] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vnc.novncproxy_host = 0.0.0.0 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.227223] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vnc.novncproxy_port = 6080 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.227402] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vnc.server_listen = 127.0.0.1 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.227577] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vnc.server_proxyclient_address = 127.0.0.1 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.227738] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 
None None] vnc.vencrypt_ca_certs = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.227899] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vnc.vencrypt_client_cert = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.228071] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vnc.vencrypt_client_key = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.228271] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.228461] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] workarounds.disable_deep_image_inspection = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.228631] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] workarounds.disable_fallback_pcpu_query = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.228796] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] workarounds.disable_group_policy_check_upcall = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.228958] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.229155] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] workarounds.disable_rootwrap = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.229337] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] workarounds.enable_numa_live_migration = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.229503] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.229668] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.229831] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] workarounds.handle_virt_lifecycle_events = True {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.229993] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] workarounds.libvirt_disable_apic = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.230189] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] 
workarounds.never_download_image_if_on_rbd = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.230368] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] workarounds.qemu_monitor_announce_self_count = 3 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.230533] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.230698] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.230862] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.231053] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.231198] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.231360] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.231521] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.231686] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.231871] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.232049] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] wsgi.client_socket_timeout = 900 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.232220] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] wsgi.default_pool_size = 1000 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.232388] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] wsgi.keep_alive = True {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.232555] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] 
wsgi.max_header_line = 16384 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.232717] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] wsgi.secure_proxy_ssl_header = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.232878] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] wsgi.ssl_ca_file = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.233047] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] wsgi.ssl_cert_file = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.233214] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] wsgi.ssl_key_file = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.233381] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] wsgi.tcp_keepidle = 600 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.233553] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.233717] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] zvm.ca_file = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.233878] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] zvm.cloud_connector_url = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.234173] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] zvm.image_tmp_path = /opt/stack/data/n-cpu-1/images {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.234353] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] zvm.reachable_timeout = 300 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.234538] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_policy.enforce_new_defaults = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.234711] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_policy.enforce_scope = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.234888] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_policy.policy_default_rule = default {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.235082] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} 
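
Every "group.option = value" entry in this dump is produced by oslo.config: at startup the service calls ConfigOpts.log_opt_values() (the log_opt_values frame cited in each entry, oslo_config/cfg.py:2609), which walks every registered group and option and logs the effective value at DEBUG level, masking any option registered with secret=True, which is why vmware.host_password and the messaging transport_url appear as ****. The row of asterisks that closes the dump further down is emitted by the same call (cfg.py:2613). The following is a minimal standalone sketch of that mechanism, with option values loosely mirroring the [vmware] entries in this dump rather than this deployment's actual nova.conf:

import logging

from oslo_config import cfg

logging.basicConfig(level=logging.DEBUG)
LOG = logging.getLogger(__name__)

CONF = cfg.ConfigOpts()
CONF.register_opts(
    [
        cfg.StrOpt('host_ip', help='vCenter hostname or IP address'),
        cfg.StrOpt('host_password', secret=True, help='vCenter password'),
        cfg.StrOpt('cluster_name', help='cluster managed by this compute node'),
    ],
    group='vmware',
)
CONF([], project='sketch')  # parse no CLI arguments and no config files

# set_override() only keeps the example self-contained; a real service reads
# these values from its configuration file instead.
CONF.set_override('host_ip', 'vc.example.test', group='vmware')
CONF.set_override('host_password', 's3cret', group='vmware')
CONF.set_override('cluster_name', 'testcl1', group='vmware')

# Logs one DEBUG line per registered option, with secret values masked as
# '****', bracketed by rows of '*' characters.
CONF.log_opt_values(LOG, logging.DEBUG)
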
[ 574.235265] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_policy.policy_file = policy.yaml {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.235440] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.235603] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.235764] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.235927] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.236103] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.236281] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.236461] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.236639] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] profiler.connection_string = messaging:// {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.236808] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] profiler.enabled = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.236979] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] profiler.es_doc_type = notification {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.237161] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] profiler.es_scroll_size = 10000 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.237335] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] profiler.es_scroll_time = 2m {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.237499] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] profiler.filter_error_trace = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.237668] env[65918]: DEBUG oslo_service.service [None 
req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] profiler.hmac_keys = SECRET_KEY {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.237836] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] profiler.sentinel_service_name = mymaster {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.238015] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] profiler.socket_timeout = 0.1 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.238182] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] profiler.trace_sqlalchemy = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.238374] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] remote_debug.host = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.238546] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] remote_debug.port = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.238725] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.238894] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.239071] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.239263] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.239434] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.239599] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.239763] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.239929] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.240113] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] 
oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.240304] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_messaging_rabbit.kombu_compression = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.240482] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.240655] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.240829] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.241011] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.241182] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.241363] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.241529] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.241692] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.241860] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.242036] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.242207] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.242380] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.242544] env[65918]: DEBUG 
oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.242709] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.242881] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.243085] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_messaging_rabbit.ssl = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.243272] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.243450] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.243618] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.243793] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.243965] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_messaging_rabbit.ssl_version = {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.244169] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.244342] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_messaging_notifications.retry = -1 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.244525] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.244698] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_messaging_notifications.transport_url = **** {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.244869] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_limit.auth_section = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.245042] env[65918]: DEBUG oslo_service.service [None 
req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_limit.auth_type = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.245208] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_limit.cafile = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.245370] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_limit.certfile = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.245534] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_limit.collect_timing = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.245695] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_limit.connect_retries = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.245855] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_limit.connect_retry_delay = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.246022] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_limit.endpoint_id = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.246185] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_limit.endpoint_override = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.246347] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_limit.insecure = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.246506] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_limit.keyfile = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.246663] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_limit.max_version = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.246818] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_limit.min_version = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.246976] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_limit.region_name = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.247150] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_limit.service_name = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.247311] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_limit.service_type = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.247476] env[65918]: DEBUG oslo_service.service [None 
req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_limit.split_loggers = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.247632] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_limit.status_code_retries = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.247793] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_limit.status_code_retry_delay = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.247953] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_limit.timeout = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.248125] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_limit.valid_interfaces = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.248287] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_limit.version = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.248454] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_reports.file_event_handler = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.248619] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_reports.file_event_handler_interval = 1 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.248780] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] oslo_reports.log_dir = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.248949] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vif_plug_linux_bridge_privileged.capabilities = [12] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.249138] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vif_plug_linux_bridge_privileged.group = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.249313] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vif_plug_linux_bridge_privileged.helper_command = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.249484] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.249648] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vif_plug_linux_bridge_privileged.thread_pool_size = 8 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.249810] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] 
vif_plug_linux_bridge_privileged.user = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.249980] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vif_plug_ovs_privileged.capabilities = [12, 1] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.250183] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vif_plug_ovs_privileged.group = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.250363] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vif_plug_ovs_privileged.helper_command = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.250534] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.250698] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vif_plug_ovs_privileged.thread_pool_size = 8 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.250861] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] vif_plug_ovs_privileged.user = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.251042] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] os_vif_linux_bridge.flat_interface = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.251232] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] os_vif_linux_bridge.forward_bridge_interface = ['all'] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.251402] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] os_vif_linux_bridge.iptables_bottom_regex = {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.251576] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] os_vif_linux_bridge.iptables_drop_action = DROP {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.251751] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] os_vif_linux_bridge.iptables_top_regex = {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.251918] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] os_vif_linux_bridge.network_device_mtu = 1500 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.252107] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] os_vif_linux_bridge.use_ipv6 = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.252263] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] os_vif_linux_bridge.vlan_interface = None 
{{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.252444] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] os_vif_ovs.isolate_vif = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.252611] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] os_vif_ovs.network_device_mtu = 1500 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.252780] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] os_vif_ovs.ovs_vsctl_timeout = 120 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.252950] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.253135] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] os_vif_ovs.ovsdb_interface = native {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.253302] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] os_vif_ovs.per_port_bridge = False {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.253466] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] os_brick.lock_path = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.253631] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] os_brick.wait_mpath_device_attempts = 4 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.253796] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] os_brick.wait_mpath_device_interval = 1 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.253964] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] privsep_osbrick.capabilities = [21] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.254139] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] privsep_osbrick.group = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.254301] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] privsep_osbrick.helper_command = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.254466] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] privsep_osbrick.logger_name = os_brick.privileged {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.254630] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] privsep_osbrick.thread_pool_size = 8 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.254788] env[65918]: DEBUG oslo_service.service 
[None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] privsep_osbrick.user = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.254960] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.255132] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] nova_sys_admin.group = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.255293] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] nova_sys_admin.helper_command = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.255458] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] nova_sys_admin.logger_name = oslo_privsep.daemon {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.255620] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] nova_sys_admin.thread_pool_size = 8 {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.255779] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] nova_sys_admin.user = None {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 574.255908] env[65918]: DEBUG oslo_service.service [None req-697f8c6a-d896-4918-b073-91ddb9f6c705 None None] ******************************************************************************** {{(pid=65918) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2613}} [ 574.256622] env[65918]: INFO nova.service [-] Starting compute node (version 0.0.1) [ 574.264909] env[65918]: INFO nova.virt.node [None req-bc8d321c-90c7-4644-aa86-1e2519be4d3d None None] Generated node identity 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 [ 574.265153] env[65918]: INFO nova.virt.node [None req-bc8d321c-90c7-4644-aa86-1e2519be4d3d None None] Wrote node identity 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 to /opt/stack/data/n-cpu-1/compute_id [ 574.276041] env[65918]: WARNING nova.compute.manager [None req-bc8d321c-90c7-4644-aa86-1e2519be4d3d None None] Compute nodes ['0bcf3fd3-93ee-4c0a-abed-95169e714cc4'] for host cpu-1 were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning. [ 574.306321] env[65918]: INFO nova.compute.manager [None req-bc8d321c-90c7-4644-aa86-1e2519be4d3d None None] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host [ 574.326094] env[65918]: WARNING nova.compute.manager [None req-bc8d321c-90c7-4644-aa86-1e2519be4d3d None None] No compute node record found for host cpu-1. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host cpu-1 could not be found. 
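
The Acquiring lock / acquired / "released" DEBUG triplets in the entries that follow (lock "compute_resources", taken by ResourceTracker.clean_compute_node_cache and ResourceTracker._update_available_resource) come from the inner wrapper that oslo.concurrency's lockutils.synchronized() places around the decorated method, as the lockutils.py:404/409/423 frames indicate; the wrapper logs how long the caller waited for the lock and how long it held it. Below is a minimal standalone sketch of that pattern; nova reaches it through its own helper around lockutils, so this is illustrative rather than nova's code, and the function body is made up:

import logging
import time

from oslo_concurrency import lockutils

logging.basicConfig(level=logging.DEBUG)


@lockutils.synchronized('compute_resources')
def _update_available_resource():
    # Anything executed here while holding the lock shows up as the
    # "held N.NNNs" figure in the released message; the "waited" figure
    # in the acquired message measures contention with other callers.
    time.sleep(0.1)


_update_available_resource()
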
[ 574.326313] env[65918]: DEBUG oslo_concurrency.lockutils [None req-bc8d321c-90c7-4644-aa86-1e2519be4d3d None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 574.326526] env[65918]: DEBUG oslo_concurrency.lockutils [None req-bc8d321c-90c7-4644-aa86-1e2519be4d3d None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 574.326672] env[65918]: DEBUG oslo_concurrency.lockutils [None req-bc8d321c-90c7-4644-aa86-1e2519be4d3d None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 574.326830] env[65918]: DEBUG nova.compute.resource_tracker [None req-bc8d321c-90c7-4644-aa86-1e2519be4d3d None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=65918) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 574.327936] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1143ea5-3b49-4b77-84d5-4246dce6a19a {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 574.336550] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9649e08c-425e-46d9-b28b-6c14799d8641 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 574.350020] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-742ea224-542c-4628-a68d-09abd2bc7455 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 574.355960] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec63c876-e73f-4074-9183-2c86d1f1e752 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 574.384908] env[65918]: DEBUG nova.compute.resource_tracker [None req-bc8d321c-90c7-4644-aa86-1e2519be4d3d None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181087MB free_disk=168GB free_vcpus=48 pci_devices=None {{(pid=65918) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 574.385089] env[65918]: DEBUG oslo_concurrency.lockutils [None req-bc8d321c-90c7-4644-aa86-1e2519be4d3d None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 574.385255] env[65918]: DEBUG oslo_concurrency.lockutils [None req-bc8d321c-90c7-4644-aa86-1e2519be4d3d None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 574.396677] env[65918]: WARNING nova.compute.resource_tracker [None req-bc8d321c-90c7-4644-aa86-1e2519be4d3d None None] No compute node 
record for cpu-1:0bcf3fd3-93ee-4c0a-abed-95169e714cc4: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 could not be found. [ 574.408431] env[65918]: INFO nova.compute.resource_tracker [None req-bc8d321c-90c7-4644-aa86-1e2519be4d3d None None] Compute node record created for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 with uuid: 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 [ 574.459470] env[65918]: DEBUG nova.compute.resource_tracker [None req-bc8d321c-90c7-4644-aa86-1e2519be4d3d None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=65918) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 574.459594] env[65918]: DEBUG nova.compute.resource_tracker [None req-bc8d321c-90c7-4644-aa86-1e2519be4d3d None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=65918) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 574.559525] env[65918]: INFO nova.scheduler.client.report [None req-bc8d321c-90c7-4644-aa86-1e2519be4d3d None None] [req-41c84491-1778-47be-8a76-49da5bdbe48b] Created resource provider record via placement API for resource provider with UUID 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 and name domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28. [ 574.575615] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5bf23961-c080-4202-9071-d2256a85692c {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 574.582799] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ccf84cc8-dfee-4999-991c-702e1fbbbb28 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 574.612111] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f057f3e3-b170-478c-a6c4-1887e3bf9fd5 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 574.618689] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73ea2a36-733e-47ca-ab0f-399eb4c85ebf {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 574.631312] env[65918]: DEBUG nova.compute.provider_tree [None req-bc8d321c-90c7-4644-aa86-1e2519be4d3d None None] Updating inventory in ProviderTree for provider 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65918) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 574.665355] env[65918]: DEBUG nova.scheduler.client.report [None req-bc8d321c-90c7-4644-aa86-1e2519be4d3d None None] Updated inventory for provider 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 with generation 0 in Placement from set_inventory_for_provider using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 
'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65918) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}} [ 574.665584] env[65918]: DEBUG nova.compute.provider_tree [None req-bc8d321c-90c7-4644-aa86-1e2519be4d3d None None] Updating resource provider 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 generation from 0 to 1 during operation: update_inventory {{(pid=65918) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 574.665729] env[65918]: DEBUG nova.compute.provider_tree [None req-bc8d321c-90c7-4644-aa86-1e2519be4d3d None None] Updating inventory in ProviderTree for provider 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65918) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 574.708887] env[65918]: DEBUG nova.compute.provider_tree [None req-bc8d321c-90c7-4644-aa86-1e2519be4d3d None None] Updating resource provider 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 generation from 1 to 2 during operation: update_traits {{(pid=65918) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 574.725670] env[65918]: DEBUG nova.compute.resource_tracker [None req-bc8d321c-90c7-4644-aa86-1e2519be4d3d None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=65918) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 574.725903] env[65918]: DEBUG oslo_concurrency.lockutils [None req-bc8d321c-90c7-4644-aa86-1e2519be4d3d None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.341s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 574.725991] env[65918]: DEBUG nova.service [None req-bc8d321c-90c7-4644-aa86-1e2519be4d3d None None] Creating RPC server for service compute {{(pid=65918) start /opt/stack/nova/nova/service.py:182}} [ 574.737963] env[65918]: DEBUG nova.service [None req-bc8d321c-90c7-4644-aa86-1e2519be4d3d None None] Join ServiceGroup membership for this service compute {{(pid=65918) start /opt/stack/nova/nova/service.py:199}} [ 574.738165] env[65918]: DEBUG nova.servicegroup.drivers.db [None req-bc8d321c-90c7-4644-aa86-1e2519be4d3d None None] DB_Driver: join new ServiceGroup member cpu-1 to the compute group, service = {{(pid=65918) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} [ 595.742927] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._sync_power_states {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 595.753248] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Getting list of instances from cluster (obj){ [ 595.753248] env[65918]: value = "domain-c8" [ 595.753248] env[65918]: _type = "ClusterComputeResource" [ 595.753248] env[65918]: } {{(pid=65918) list_instances 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 595.754378] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fdbeb4c8-1a91-44ee-85f7-507e0dfb5b5c {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 595.763499] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Got total of 0 instances {{(pid=65918) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 595.763716] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 595.764015] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Getting list of instances from cluster (obj){ [ 595.764015] env[65918]: value = "domain-c8" [ 595.764015] env[65918]: _type = "ClusterComputeResource" [ 595.764015] env[65918]: } {{(pid=65918) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 595.764840] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d3c13b53-f4de-485e-bbc1-3bef2690d431 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 595.772248] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Got total of 0 instances {{(pid=65918) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 616.563515] env[65918]: DEBUG oslo_concurrency.lockutils [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Acquiring lock "2efc86dd-575c-4d78-a5ca-592f077655de" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 616.563812] env[65918]: DEBUG oslo_concurrency.lockutils [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Lock "2efc86dd-575c-4d78-a5ca-592f077655de" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 616.580663] env[65918]: DEBUG nova.compute.manager [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] Starting instance... 
{{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 616.677322] env[65918]: DEBUG oslo_concurrency.lockutils [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 616.677534] env[65918]: DEBUG oslo_concurrency.lockutils [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 616.679108] env[65918]: INFO nova.compute.claims [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 616.810930] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ace63062-b397-4aa5-ba18-baf8bf729ac1 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 616.822353] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af82d093-947c-41bf-95c6-cb2fb982f628 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 616.861714] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93950196-a1b6-433a-9d25-73e97d62417b {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 616.869607] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b52465e-0dc4-4d60-8028-09767cef24e6 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 616.884539] env[65918]: DEBUG nova.compute.provider_tree [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Inventory has not changed in ProviderTree for provider: 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 {{(pid=65918) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 616.896107] env[65918]: DEBUG nova.scheduler.client.report [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Inventory has not changed for provider 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65918) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 616.919123] env[65918]: DEBUG oslo_concurrency.lockutils [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 
tempest-ServersAaction247Test-310550292-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.241s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 616.919767] env[65918]: DEBUG nova.compute.manager [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] Start building networks asynchronously for instance. {{(pid=65918) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 616.966623] env[65918]: DEBUG nova.compute.utils [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Using /dev/sd instead of None {{(pid=65918) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 616.968014] env[65918]: DEBUG nova.compute.manager [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] Not allocating networking since 'none' was specified. {{(pid=65918) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1948}} [ 616.988221] env[65918]: DEBUG nova.compute.manager [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] Start building block device mappings for instance. {{(pid=65918) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 617.076407] env[65918]: DEBUG nova.compute.manager [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] Start spawning the instance on the hypervisor. 
{{(pid=65918) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 617.199132] env[65918]: DEBUG nova.virt.hardware [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:52:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:52:33Z,direct_url=,disk_format='vmdk',id=e017c336-3a02-4b58-874a-44a1d1e154fd,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='a06639ef48fd43768e273db76d6c8f54',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:52:33Z,virtual_size=,visibility=), allow threads: False {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 617.199403] env[65918]: DEBUG nova.virt.hardware [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Flavor limits 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 617.199561] env[65918]: DEBUG nova.virt.hardware [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Image limits 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 617.199750] env[65918]: DEBUG nova.virt.hardware [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Flavor pref 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 617.199897] env[65918]: DEBUG nova.virt.hardware [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Image pref 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 617.200217] env[65918]: DEBUG nova.virt.hardware [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 617.200340] env[65918]: DEBUG nova.virt.hardware [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 617.200452] env[65918]: DEBUG nova.virt.hardware [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=65918) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 617.200818] env[65918]: DEBUG nova.virt.hardware [None 
req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Got 1 possible topologies {{(pid=65918) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 617.201012] env[65918]: DEBUG nova.virt.hardware [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 617.201440] env[65918]: DEBUG nova.virt.hardware [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 617.203571] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2583b7e0-2c12-4c17-adad-421867f1add5 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 617.212719] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0800e5c6-1358-4e82-8159-631b2068977b {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 617.229964] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-531f2da8-bd0c-43c2-962b-e62a525273e1 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 617.250948] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] Instance VIF info [] {{(pid=65918) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 617.261701] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Creating folder: OpenStack. Parent ref: group-v4. {{(pid=65918) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 617.262068] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-5ef2f8a4-6f89-433b-b7b5-7c86140dd242 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 617.275650] env[65918]: INFO nova.virt.vmwareapi.vm_util [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Created folder: OpenStack in parent group-v4. [ 617.275650] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Creating folder: Project (3d1d8f2fc90f4433a537cfa6134067df). Parent ref: group-v572679. 
{{(pid=65918) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 617.277307] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ebb7fc97-6351-49f8-be96-cf3cb0a327ab {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 617.288056] env[65918]: INFO nova.virt.vmwareapi.vm_util [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Created folder: Project (3d1d8f2fc90f4433a537cfa6134067df) in parent group-v572679. [ 617.288056] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Creating folder: Instances. Parent ref: group-v572680. {{(pid=65918) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 617.288262] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9ea7e41b-9866-4b69-b8a7-161e558d14da {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 617.297399] env[65918]: INFO nova.virt.vmwareapi.vm_util [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Created folder: Instances in parent group-v572680. [ 617.297675] env[65918]: DEBUG oslo.service.loopingcall [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=65918) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 617.297859] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] Creating VM on the ESX host {{(pid=65918) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 617.298054] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-20f16124-d470-42dd-926a-09ac28c25cad {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 617.314587] env[65918]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 617.314587] env[65918]: value = "task-2848133" [ 617.314587] env[65918]: _type = "Task" [ 617.314587] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 617.324242] env[65918]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848133, 'name': CreateVM_Task} progress is 0%. 
{{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 617.368992] env[65918]: DEBUG oslo_concurrency.lockutils [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Acquiring lock "d34229ba-b110-41aa-b68f-e2d107fd817e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 617.369277] env[65918]: DEBUG oslo_concurrency.lockutils [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Lock "d34229ba-b110-41aa-b68f-e2d107fd817e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 617.391509] env[65918]: DEBUG nova.compute.manager [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] Starting instance... {{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 617.455884] env[65918]: DEBUG oslo_concurrency.lockutils [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 617.455884] env[65918]: DEBUG oslo_concurrency.lockutils [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 617.457390] env[65918]: INFO nova.compute.claims [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 617.616239] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1506b2ee-7bae-401d-8858-5e91a9726a9f {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 617.623831] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-34b2fa4e-17bf-468e-b47d-6cbeffa32309 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 617.661276] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00472dc6-ebf3-4eb7-83be-36327958989f {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 617.670708] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-22fb59ae-e782-45ae-aa24-846456a984c6 
{{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 617.689469] env[65918]: DEBUG nova.compute.provider_tree [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Inventory has not changed in ProviderTree for provider: 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 {{(pid=65918) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 617.708081] env[65918]: DEBUG nova.scheduler.client.report [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Inventory has not changed for provider 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65918) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 617.734353] env[65918]: DEBUG oslo_concurrency.lockutils [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.277s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 617.734905] env[65918]: DEBUG nova.compute.manager [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] Start building networks asynchronously for instance. {{(pid=65918) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 617.789286] env[65918]: DEBUG nova.compute.utils [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Using /dev/sd instead of None {{(pid=65918) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 617.791609] env[65918]: DEBUG nova.compute.manager [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] Allocating IP information in the background. {{(pid=65918) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 617.792064] env[65918]: DEBUG nova.network.neutron [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] allocate_for_instance() {{(pid=65918) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 617.803497] env[65918]: DEBUG nova.compute.manager [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] Start building block device mappings for instance. 
{{(pid=65918) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 617.829411] env[65918]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848133, 'name': CreateVM_Task, 'duration_secs': 0.252878} completed successfully. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 617.829692] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] Created VM on the ESX host {{(pid=65918) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 617.830592] env[65918]: DEBUG oslo_vmware.service [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-52cbf545-8601-4c62-93ff-6bbd255356e8 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 617.843697] env[65918]: DEBUG oslo_concurrency.lockutils [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 617.843888] env[65918]: DEBUG oslo_concurrency.lockutils [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 617.844573] env[65918]: DEBUG oslo_concurrency.lockutils [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 617.848273] env[65918]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6c4a08c7-3084-4426-860f-3309b506534b {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 617.859681] env[65918]: DEBUG oslo_vmware.api [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Waiting for the task: (returnval){ [ 617.859681] env[65918]: value = "session[523f5ec2-7667-1e2c-bfa3-3051980fd847]52e55214-97fb-0431-1d3c-c6afaaf30058" [ 617.859681] env[65918]: _type = "Task" [ 617.859681] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 617.869065] env[65918]: DEBUG oslo_vmware.api [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Task: {'id': session[523f5ec2-7667-1e2c-bfa3-3051980fd847]52e55214-97fb-0431-1d3c-c6afaaf30058, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 617.902246] env[65918]: DEBUG nova.compute.manager [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] Start spawning the instance on the hypervisor. {{(pid=65918) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 617.930903] env[65918]: DEBUG nova.virt.hardware [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:52:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:52:33Z,direct_url=,disk_format='vmdk',id=e017c336-3a02-4b58-874a-44a1d1e154fd,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='a06639ef48fd43768e273db76d6c8f54',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:52:33Z,virtual_size=,visibility=), allow threads: False {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 617.931913] env[65918]: DEBUG nova.virt.hardware [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Flavor limits 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 617.932138] env[65918]: DEBUG nova.virt.hardware [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Image limits 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 617.932288] env[65918]: DEBUG nova.virt.hardware [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Flavor pref 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 617.932426] env[65918]: DEBUG nova.virt.hardware [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Image pref 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 617.932562] env[65918]: DEBUG nova.virt.hardware [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 617.933216] env[65918]: DEBUG nova.virt.hardware [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum 
VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 617.933216] env[65918]: DEBUG nova.virt.hardware [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=65918) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 617.933216] env[65918]: DEBUG nova.virt.hardware [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Got 1 possible topologies {{(pid=65918) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 617.933347] env[65918]: DEBUG nova.virt.hardware [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 617.933412] env[65918]: DEBUG nova.virt.hardware [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 617.934299] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ebe3f10e-bb32-4872-ae23-4149dcc9971e {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 617.943405] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e302d21a-814e-4dba-a741-a258e6ddaed2 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 618.140439] env[65918]: DEBUG nova.policy [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5ae956f29e9e4ad499b78e0130d0adb7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f56a79d560ba41a09b75d24eb13e2470', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=65918) authorize /opt/stack/nova/nova/policy.py:203}} [ 618.373571] env[65918]: DEBUG oslo_concurrency.lockutils [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 618.376022] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] Processing image e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) 
_fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 618.376022] env[65918]: DEBUG oslo_concurrency.lockutils [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 618.376022] env[65918]: DEBUG oslo_concurrency.lockutils [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 618.376022] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 618.376475] env[65918]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-501cfd10-099e-4aef-9ae3-cd20c63a0b0e {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 618.399378] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 618.400240] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=65918) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 618.401529] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-80b790b2-f7fb-4a27-9c08-62f3f82d53cd {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 618.413494] env[65918]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d2090188-e5aa-4db8-9ab5-31161e97d795 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 618.419276] env[65918]: DEBUG oslo_vmware.api [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Waiting for the task: (returnval){ [ 618.419276] env[65918]: value = "session[523f5ec2-7667-1e2c-bfa3-3051980fd847]52420f1a-aba1-a8dc-4588-f4d605260642" [ 618.419276] env[65918]: _type = "Task" [ 618.419276] env[65918]: } to complete. 
{{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 618.431878] env[65918]: DEBUG oslo_vmware.api [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Task: {'id': session[523f5ec2-7667-1e2c-bfa3-3051980fd847]52420f1a-aba1-a8dc-4588-f4d605260642, 'name': SearchDatastore_Task} progress is 0%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 618.932472] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] Preparing fetch location {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 618.933157] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Creating directory with path [datastore1] vmware_temp/d4bf6097-fc19-4c11-bfe6-98f4caf0b216/e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 618.933157] env[65918]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5ad8cbda-baa5-4b91-bdd9-c22a6a0317ab {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 618.965738] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Created directory with path [datastore1] vmware_temp/d4bf6097-fc19-4c11-bfe6-98f4caf0b216/e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 618.967413] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] Fetch image to [datastore1] vmware_temp/d4bf6097-fc19-4c11-bfe6-98f4caf0b216/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 618.967524] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] Downloading image file data e017c336-3a02-4b58-874a-44a1d1e154fd to [datastore1] vmware_temp/d4bf6097-fc19-4c11-bfe6-98f4caf0b216/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk on the data store datastore1 {{(pid=65918) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 618.968670] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec410b5a-f7bc-43ba-8d6a-cf544b914108 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 618.979317] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b38b25c4-e08f-469a-a6c4-e975d9268b5d {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 618.993433] env[65918]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-967f83a1-a877-4425-b40c-3d63b6825ea2 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.029235] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-613d2443-dbd7-4955-9c77-27d95ffc2a2d {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.038224] env[65918]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-64ea43a1-e328-4496-be35-a871b01ae3cd {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.066213] env[65918]: DEBUG nova.virt.vmwareapi.images [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] Downloading image file data e017c336-3a02-4b58-874a-44a1d1e154fd to the data store datastore1 {{(pid=65918) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 619.154444] env[65918]: DEBUG oslo_vmware.rw_handles [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d4bf6097-fc19-4c11-bfe6-98f4caf0b216/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=65918) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 619.229684] env[65918]: DEBUG oslo_vmware.rw_handles [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Completed reading data from the image iterator. {{(pid=65918) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 619.229866] env[65918]: DEBUG oslo_vmware.rw_handles [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d4bf6097-fc19-4c11-bfe6-98f4caf0b216/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=65918) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 619.512974] env[65918]: DEBUG oslo_concurrency.lockutils [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Acquiring lock "169c3642-1229-4c49-9e04-67e4e1764286" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 619.512974] env[65918]: DEBUG oslo_concurrency.lockutils [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Lock "169c3642-1229-4c49-9e04-67e4e1764286" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 619.538103] env[65918]: DEBUG nova.compute.manager [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] [instance: 169c3642-1229-4c49-9e04-67e4e1764286] Starting instance... {{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 619.610434] env[65918]: DEBUG oslo_concurrency.lockutils [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 619.610434] env[65918]: DEBUG oslo_concurrency.lockutils [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 619.611899] env[65918]: INFO nova.compute.claims [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] [instance: 169c3642-1229-4c49-9e04-67e4e1764286] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 619.760539] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7dceeccb-96c7-484d-9241-63c9c19b9283 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.772870] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2203179e-1f33-4471-9a4e-cf5adad52626 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.806574] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d241290-2bf7-44fb-9310-0add20e620ea {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.814761] env[65918]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2eeac452-a1ea-49d3-89e9-4f8df8e13650 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.829793] env[65918]: DEBUG nova.compute.provider_tree [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Inventory has not changed in ProviderTree for provider: 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 {{(pid=65918) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 619.844017] env[65918]: DEBUG nova.scheduler.client.report [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Inventory has not changed for provider 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65918) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 619.865653] env[65918]: DEBUG oslo_concurrency.lockutils [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.255s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 619.866290] env[65918]: DEBUG nova.compute.manager [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] [instance: 169c3642-1229-4c49-9e04-67e4e1764286] Start building networks asynchronously for instance. {{(pid=65918) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 619.919018] env[65918]: DEBUG nova.compute.utils [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Using /dev/sd instead of None {{(pid=65918) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 619.923071] env[65918]: DEBUG nova.compute.manager [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] [instance: 169c3642-1229-4c49-9e04-67e4e1764286] Allocating IP information in the background. 
{{(pid=65918) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 619.923071] env[65918]: DEBUG nova.network.neutron [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] [instance: 169c3642-1229-4c49-9e04-67e4e1764286] allocate_for_instance() {{(pid=65918) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 619.943748] env[65918]: DEBUG nova.compute.manager [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] [instance: 169c3642-1229-4c49-9e04-67e4e1764286] Start building block device mappings for instance. {{(pid=65918) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 620.042876] env[65918]: DEBUG nova.compute.manager [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] [instance: 169c3642-1229-4c49-9e04-67e4e1764286] Start spawning the instance on the hypervisor. {{(pid=65918) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 620.064745] env[65918]: DEBUG nova.network.neutron [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] Successfully created port: 30b38db4-2d87-4551-a4fe-bc7427cb87d5 {{(pid=65918) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 620.083886] env[65918]: DEBUG nova.virt.hardware [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:52:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:52:33Z,direct_url=,disk_format='vmdk',id=e017c336-3a02-4b58-874a-44a1d1e154fd,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='a06639ef48fd43768e273db76d6c8f54',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:52:33Z,virtual_size=,visibility=), allow threads: False {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 620.084136] env[65918]: DEBUG nova.virt.hardware [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Flavor limits 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 620.084289] env[65918]: DEBUG nova.virt.hardware [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Image limits 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 620.084470] env[65918]: DEBUG nova.virt.hardware [None 
req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Flavor pref 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 620.084594] env[65918]: DEBUG nova.virt.hardware [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Image pref 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 620.084731] env[65918]: DEBUG nova.virt.hardware [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 620.084930] env[65918]: DEBUG nova.virt.hardware [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 620.085090] env[65918]: DEBUG nova.virt.hardware [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=65918) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 620.085249] env[65918]: DEBUG nova.virt.hardware [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Got 1 possible topologies {{(pid=65918) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 620.085403] env[65918]: DEBUG nova.virt.hardware [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 620.085566] env[65918]: DEBUG nova.virt.hardware [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 620.086743] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e8c872e8-7598-4dbe-a3de-c9ce1b65aee4 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 620.095732] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e683499-5bd3-4eb3-bfa9-7cb3a6a2dbbb {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 620.299057] env[65918]: DEBUG nova.policy [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa 
tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '33415e73be934e1ca80f458f50ab3533', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ffe525e4ef6f4d70afff3e304a002bea', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=65918) authorize /opt/stack/nova/nova/policy.py:203}} [ 620.329182] env[65918]: DEBUG oslo_concurrency.lockutils [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Acquiring lock "cf0087c7-22d0-4317-a00a-73967ccafeaa" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 620.329508] env[65918]: DEBUG oslo_concurrency.lockutils [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Lock "cf0087c7-22d0-4317-a00a-73967ccafeaa" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 620.339404] env[65918]: DEBUG nova.compute.manager [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] Starting instance... 
{{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 620.405129] env[65918]: DEBUG oslo_concurrency.lockutils [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 620.405385] env[65918]: DEBUG oslo_concurrency.lockutils [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 620.406858] env[65918]: INFO nova.compute.claims [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 620.564403] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef50e572-1ec3-4ce4-b8a2-2a22e2b9a587 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 620.573189] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-60696e69-3428-4dc5-9a9c-f3cbf28b2877 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 620.612227] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a6466059-0e87-4ec2-9247-db7a88ca6c4d {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 620.621330] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29454189-1f23-4349-bc44-0c3af5d4f7ba {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 620.641259] env[65918]: DEBUG nova.compute.provider_tree [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Inventory has not changed in ProviderTree for provider: 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 {{(pid=65918) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 620.650576] env[65918]: DEBUG nova.scheduler.client.report [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Inventory has not changed for provider 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65918) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 620.668397] env[65918]: DEBUG oslo_concurrency.lockutils [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 
tempest-MigrationsAdminTest-305664523-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.263s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 620.670474] env[65918]: DEBUG nova.compute.manager [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] Start building networks asynchronously for instance. {{(pid=65918) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 620.717563] env[65918]: DEBUG nova.compute.utils [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Using /dev/sd instead of None {{(pid=65918) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 620.720124] env[65918]: DEBUG nova.compute.manager [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] Allocating IP information in the background. {{(pid=65918) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 620.720124] env[65918]: DEBUG nova.network.neutron [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] allocate_for_instance() {{(pid=65918) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 620.729729] env[65918]: DEBUG nova.compute.manager [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] Start building block device mappings for instance. {{(pid=65918) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 620.820987] env[65918]: DEBUG nova.compute.manager [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] Start spawning the instance on the hypervisor. 
{{(pid=65918) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 620.848078] env[65918]: DEBUG nova.virt.hardware [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:52:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:52:33Z,direct_url=,disk_format='vmdk',id=e017c336-3a02-4b58-874a-44a1d1e154fd,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='a06639ef48fd43768e273db76d6c8f54',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:52:33Z,virtual_size=,visibility=), allow threads: False {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 620.849634] env[65918]: DEBUG nova.virt.hardware [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Flavor limits 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 620.849634] env[65918]: DEBUG nova.virt.hardware [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Image limits 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 620.849634] env[65918]: DEBUG nova.virt.hardware [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Flavor pref 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 620.849634] env[65918]: DEBUG nova.virt.hardware [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Image pref 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 620.849634] env[65918]: DEBUG nova.virt.hardware [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 620.850160] env[65918]: DEBUG nova.virt.hardware [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 620.850160] env[65918]: DEBUG nova.virt.hardware [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=65918) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 620.850160] env[65918]: DEBUG nova.virt.hardware [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 
tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Got 1 possible topologies {{(pid=65918) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 620.850160] env[65918]: DEBUG nova.virt.hardware [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 620.850160] env[65918]: DEBUG nova.virt.hardware [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 620.850988] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-95156be8-7144-4b74-8509-4c897975ce18 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 620.862431] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ae7d7ae-7ee8-4226-8ee5-87a5dc9d1057 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 621.134221] env[65918]: DEBUG nova.policy [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f3bba2cf8fa041ad843e52c914d085c2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b3e200cfc29d449a8025e989582f916b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=65918) authorize /opt/stack/nova/nova/policy.py:203}} [ 621.977564] env[65918]: DEBUG oslo_concurrency.lockutils [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Acquiring lock "7bc8087e-17e1-4cbd-84be-bd6c07e104ce" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 621.977564] env[65918]: DEBUG oslo_concurrency.lockutils [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Lock "7bc8087e-17e1-4cbd-84be-bd6c07e104ce" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 621.993983] env[65918]: DEBUG nova.compute.manager [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] Starting instance... 
{{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 621.999457] env[65918]: DEBUG nova.network.neutron [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] [instance: 169c3642-1229-4c49-9e04-67e4e1764286] Successfully created port: ca75e443-247a-4fe0-b608-39335e73fbb4 {{(pid=65918) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 622.053860] env[65918]: DEBUG oslo_concurrency.lockutils [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 622.054351] env[65918]: DEBUG oslo_concurrency.lockutils [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 622.056185] env[65918]: INFO nova.compute.claims [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 622.237290] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-150dab3e-e336-43fd-bb9c-2784621934c5 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 622.248398] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1460f44e-e110-41ef-8700-322ef9261e26 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 622.292904] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f63bbe4a-f687-402b-9976-38b66a5081b6 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 622.303826] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6872dd95-ba0a-4825-bb95-2a24904afa4d {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 622.323859] env[65918]: DEBUG nova.compute.provider_tree [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Inventory has not changed in ProviderTree for provider: 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 {{(pid=65918) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 622.337253] env[65918]: DEBUG nova.scheduler.client.report [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Inventory has not changed for provider 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 
'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65918) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 622.354032] env[65918]: DEBUG oslo_concurrency.lockutils [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.300s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 622.354565] env[65918]: DEBUG nova.compute.manager [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] Start building networks asynchronously for instance. {{(pid=65918) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 622.432698] env[65918]: DEBUG nova.compute.utils [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Using /dev/sd instead of None {{(pid=65918) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 622.437122] env[65918]: DEBUG nova.compute.manager [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] Allocating IP information in the background. {{(pid=65918) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 622.437378] env[65918]: DEBUG nova.network.neutron [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] allocate_for_instance() {{(pid=65918) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 622.450455] env[65918]: DEBUG nova.compute.manager [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] Start building block device mappings for instance. {{(pid=65918) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 622.556347] env[65918]: DEBUG nova.compute.manager [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] Start spawning the instance on the hypervisor. 
{{(pid=65918) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 622.581809] env[65918]: DEBUG nova.virt.hardware [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:52:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:52:33Z,direct_url=,disk_format='vmdk',id=e017c336-3a02-4b58-874a-44a1d1e154fd,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='a06639ef48fd43768e273db76d6c8f54',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:52:33Z,virtual_size=,visibility=), allow threads: False {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 622.582144] env[65918]: DEBUG nova.virt.hardware [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Flavor limits 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 622.583531] env[65918]: DEBUG nova.virt.hardware [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Image limits 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 622.583846] env[65918]: DEBUG nova.virt.hardware [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Flavor pref 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 622.583896] env[65918]: DEBUG nova.virt.hardware [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Image pref 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 622.584051] env[65918]: DEBUG nova.virt.hardware [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 622.584265] env[65918]: DEBUG nova.virt.hardware [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 622.584414] env[65918]: DEBUG nova.virt.hardware [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=65918) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 622.584572] env[65918]: DEBUG nova.virt.hardware [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Got 1 possible topologies {{(pid=65918) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 622.584730] env[65918]: DEBUG nova.virt.hardware [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 622.584894] env[65918]: DEBUG nova.virt.hardware [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 622.585784] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-052146bc-ab31-4674-b2a6-9a10059b9617 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 622.596920] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27684887-99ef-4707-b1e0-117f30b3fd13 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 623.076026] env[65918]: DEBUG nova.policy [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd03dde4e89a94abe86e72cefd96eda08', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'aa2e2978302b4e76b809fddcba0eab40', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=65918) authorize /opt/stack/nova/nova/policy.py:203}} [ 623.703684] env[65918]: DEBUG nova.network.neutron [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] Successfully created port: 2fc1c238-d490-4df5-8812-fb2be8ea6fc7 {{(pid=65918) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 624.290938] env[65918]: DEBUG nova.network.neutron [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] Successfully updated port: 30b38db4-2d87-4551-a4fe-bc7427cb87d5 {{(pid=65918) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 624.307365] env[65918]: DEBUG oslo_concurrency.lockutils [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Acquiring lock "refresh_cache-d34229ba-b110-41aa-b68f-e2d107fd817e" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 
624.307365] env[65918]: DEBUG oslo_concurrency.lockutils [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Acquired lock "refresh_cache-d34229ba-b110-41aa-b68f-e2d107fd817e" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 624.308870] env[65918]: DEBUG nova.network.neutron [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] Building network info cache for instance {{(pid=65918) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 624.501773] env[65918]: DEBUG nova.network.neutron [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] Instance cache missing network info. {{(pid=65918) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 625.541762] env[65918]: DEBUG nova.network.neutron [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] Successfully created port: 518f1e2f-c72f-4ba3-a1b2-10c57cde2fa8 {{(pid=65918) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 625.865140] env[65918]: DEBUG nova.network.neutron [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] Updating instance_info_cache with network_info: [{"id": "30b38db4-2d87-4551-a4fe-bc7427cb87d5", "address": "fa:16:3e:14:b4:a5", "network": {"id": "6f8eb533-989d-48a0-b84e-5549e9b4efc9", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.83", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "a06639ef48fd43768e273db76d6c8f54", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c0d5204b-f60e-4830-84c8-2fe246c28202", "external-id": "nsx-vlan-transportzone-104", "segmentation_id": 104, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap30b38db4-2d", "ovs_interfaceid": "30b38db4-2d87-4551-a4fe-bc7427cb87d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 625.882839] env[65918]: DEBUG oslo_concurrency.lockutils [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Releasing lock "refresh_cache-d34229ba-b110-41aa-b68f-e2d107fd817e" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 625.883318] env[65918]: DEBUG nova.compute.manager [None 
req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] Instance network_info: |[{"id": "30b38db4-2d87-4551-a4fe-bc7427cb87d5", "address": "fa:16:3e:14:b4:a5", "network": {"id": "6f8eb533-989d-48a0-b84e-5549e9b4efc9", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.83", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "a06639ef48fd43768e273db76d6c8f54", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c0d5204b-f60e-4830-84c8-2fe246c28202", "external-id": "nsx-vlan-transportzone-104", "segmentation_id": 104, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap30b38db4-2d", "ovs_interfaceid": "30b38db4-2d87-4551-a4fe-bc7427cb87d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=65918) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 625.884294] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:14:b4:a5', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'c0d5204b-f60e-4830-84c8-2fe246c28202', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '30b38db4-2d87-4551-a4fe-bc7427cb87d5', 'vif_model': 'vmxnet3'}] {{(pid=65918) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 625.899533] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Creating folder: Project (f56a79d560ba41a09b75d24eb13e2470). Parent ref: group-v572679. {{(pid=65918) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 625.900753] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9d82adde-1309-4645-a458-4a7b55f1b42c {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 625.916145] env[65918]: INFO nova.virt.vmwareapi.vm_util [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Created folder: Project (f56a79d560ba41a09b75d24eb13e2470) in parent group-v572679. [ 625.916145] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Creating folder: Instances. Parent ref: group-v572683. 
{{(pid=65918) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 625.916145] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e490e224-fbd5-4324-a30f-cb22a6afd116 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 625.924313] env[65918]: INFO nova.virt.vmwareapi.vm_util [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Created folder: Instances in parent group-v572683. [ 625.924556] env[65918]: DEBUG oslo.service.loopingcall [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=65918) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 625.924819] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] Creating VM on the ESX host {{(pid=65918) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 625.925094] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-f5767307-7ade-4233-918c-a72d2bf8e8bc {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 625.959393] env[65918]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 625.959393] env[65918]: value = "task-2848136" [ 625.959393] env[65918]: _type = "Task" [ 625.959393] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 625.972658] env[65918]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848136, 'name': CreateVM_Task} progress is 5%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 626.473147] env[65918]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848136, 'name': CreateVM_Task, 'duration_secs': 0.325627} completed successfully. 
{{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 626.473147] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] Created VM on the ESX host {{(pid=65918) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 626.492186] env[65918]: DEBUG oslo_concurrency.lockutils [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 626.492361] env[65918]: DEBUG oslo_concurrency.lockutils [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 626.492949] env[65918]: DEBUG oslo_concurrency.lockutils [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 626.493229] env[65918]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-473889e7-2e20-409d-adab-53d88248490e {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 626.498885] env[65918]: DEBUG oslo_vmware.api [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Waiting for the task: (returnval){ [ 626.498885] env[65918]: value = "session[523f5ec2-7667-1e2c-bfa3-3051980fd847]52d53df8-da52-ac74-69e0-ef5205cbed35" [ 626.498885] env[65918]: _type = "Task" [ 626.498885] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 626.509048] env[65918]: DEBUG oslo_vmware.api [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Task: {'id': session[523f5ec2-7667-1e2c-bfa3-3051980fd847]52d53df8-da52-ac74-69e0-ef5205cbed35, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 626.546530] env[65918]: DEBUG nova.compute.manager [req-0d74eba1-1217-450b-92e9-fb5259ded2c4 req-d490d9ed-c6ae-4e42-8250-57fc2c0724c5 service nova] [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] Received event network-vif-plugged-30b38db4-2d87-4551-a4fe-bc7427cb87d5 {{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 626.546791] env[65918]: DEBUG oslo_concurrency.lockutils [req-0d74eba1-1217-450b-92e9-fb5259ded2c4 req-d490d9ed-c6ae-4e42-8250-57fc2c0724c5 service nova] Acquiring lock "d34229ba-b110-41aa-b68f-e2d107fd817e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 626.546968] env[65918]: DEBUG oslo_concurrency.lockutils [req-0d74eba1-1217-450b-92e9-fb5259ded2c4 req-d490d9ed-c6ae-4e42-8250-57fc2c0724c5 service nova] Lock "d34229ba-b110-41aa-b68f-e2d107fd817e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 626.548699] env[65918]: DEBUG oslo_concurrency.lockutils [req-0d74eba1-1217-450b-92e9-fb5259ded2c4 req-d490d9ed-c6ae-4e42-8250-57fc2c0724c5 service nova] Lock "d34229ba-b110-41aa-b68f-e2d107fd817e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 626.548894] env[65918]: DEBUG nova.compute.manager [req-0d74eba1-1217-450b-92e9-fb5259ded2c4 req-d490d9ed-c6ae-4e42-8250-57fc2c0724c5 service nova] [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] No waiting events found dispatching network-vif-plugged-30b38db4-2d87-4551-a4fe-bc7427cb87d5 {{(pid=65918) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 626.549077] env[65918]: WARNING nova.compute.manager [req-0d74eba1-1217-450b-92e9-fb5259ded2c4 req-d490d9ed-c6ae-4e42-8250-57fc2c0724c5 service nova] [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] Received unexpected event network-vif-plugged-30b38db4-2d87-4551-a4fe-bc7427cb87d5 for instance with vm_state building and task_state spawning. 
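The repeated "Acquiring lock … acquired … :: waited / released … :: held" triplets throughout this trace (the per-instance "-events" lock just above, "compute_resources", "refresh_cache-…") are emitted by oslo.concurrency's lockutils wrappers around each critical section. A minimal sketch of that pattern follows; it assumes oslo.concurrency is installed, and the lock names and the sleep() placeholder are purely illustrative, not Nova's actual code:

    # Minimal sketch of the named-lock pattern seen in the surrounding log
    # entries. Assumes oslo.concurrency is installed; the guarded "work" is
    # invented for illustration -- this is not Nova's implementation.
    import time

    from oslo_concurrency import lockutils


    @lockutils.synchronized('d34229ba-b110-41aa-b68f-e2d107fd817e-events')
    def pop_event_like_worker():
        # Runs while holding the named internal semaphore; the decorator's
        # wrapper is what produces "acquired ... :: waited" /
        # "released ... :: held" pairs like those in the trace above.
        time.sleep(0.01)


    def claim_like_worker():
        # Equivalent context-manager form, comparable to the
        # Acquiring lock "compute_resources" ... entries earlier in the log.
        with lockutils.lock('compute_resources'):
            time.sleep(0.01)


    if __name__ == '__main__':
        pop_event_like_worker()
        claim_like_worker()

Similarly, the CreateVM_Task and SearchDatastore_Task entries ("progress is 5%.", "completed successfully") come from oslo.vmware's wait_for_task polling the vCenter task object until it finishes. The sketch below shows only that poll-until-done control flow under stated assumptions: get_task_info is a hypothetical stand-in for the real vSphere property read, and the dictionaries are made-up task states.

    # Rough sketch of the poll-until-done loop behind the
    # "Task: {'id': task-2848136, 'name': CreateVM_Task} progress is 5%"
    # entries. get_task_info is a hypothetical stand-in for the vSphere
    # property read; only the control flow is meant to mirror the log.
    import time


    def wait_for_task(get_task_info, poll_interval=0.5):
        while True:
            info = get_task_info()  # e.g. {'state': 'running', 'progress': 5}
            if info['state'] == 'success':
                return info.get('result')
            if info['state'] == 'error':
                raise RuntimeError(info.get('error', 'task failed'))
            time.sleep(poll_interval)


    if __name__ == '__main__':
        states = iter([{'state': 'running', 'progress': 5},
                       {'state': 'success', 'result': 'vm-example'}])
        print(wait_for_task(lambda: next(states), poll_interval=0.0))
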
[ 627.014206] env[65918]: DEBUG oslo_concurrency.lockutils [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 627.014693] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] Processing image e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 627.014693] env[65918]: DEBUG oslo_concurrency.lockutils [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 627.031319] env[65918]: DEBUG nova.network.neutron [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] [instance: 169c3642-1229-4c49-9e04-67e4e1764286] Successfully updated port: ca75e443-247a-4fe0-b608-39335e73fbb4 {{(pid=65918) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 627.047538] env[65918]: DEBUG oslo_concurrency.lockutils [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Acquiring lock "refresh_cache-169c3642-1229-4c49-9e04-67e4e1764286" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 627.047538] env[65918]: DEBUG oslo_concurrency.lockutils [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Acquired lock "refresh_cache-169c3642-1229-4c49-9e04-67e4e1764286" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 627.047538] env[65918]: DEBUG nova.network.neutron [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] [instance: 169c3642-1229-4c49-9e04-67e4e1764286] Building network info cache for instance {{(pid=65918) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 627.273025] env[65918]: DEBUG nova.network.neutron [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] [instance: 169c3642-1229-4c49-9e04-67e4e1764286] Instance cache missing network info. 
{{(pid=65918) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 628.085849] env[65918]: DEBUG oslo_concurrency.lockutils [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Acquiring lock "5c83d7da-f63b-40b7-a1aa-916ba9343439" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 628.086334] env[65918]: DEBUG oslo_concurrency.lockutils [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Lock "5c83d7da-f63b-40b7-a1aa-916ba9343439" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 628.099690] env[65918]: DEBUG nova.compute.manager [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Starting instance... {{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 628.159189] env[65918]: DEBUG oslo_concurrency.lockutils [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 628.159368] env[65918]: DEBUG oslo_concurrency.lockutils [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 628.161746] env[65918]: INFO nova.compute.claims [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 628.337389] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61a0d51a-c349-4a97-beef-d69b710cbe07 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 628.347711] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc189b94-341a-4278-ad90-df143a985ce9 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 628.383103] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3a9a8f3-74fa-4648-94b5-6ffc08f8ebc7 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 628.392085] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f6a32115-4e39-4185-b2ca-d69bb3cbe834 {{(pid=65918) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 628.411657] env[65918]: DEBUG nova.compute.provider_tree [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Inventory has not changed in ProviderTree for provider: 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 {{(pid=65918) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 628.429193] env[65918]: DEBUG nova.scheduler.client.report [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Inventory has not changed for provider 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65918) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 628.453277] env[65918]: DEBUG oslo_concurrency.lockutils [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.293s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 628.453587] env[65918]: DEBUG nova.compute.manager [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Start building networks asynchronously for instance. {{(pid=65918) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 628.508325] env[65918]: DEBUG nova.compute.utils [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Using /dev/sd instead of None {{(pid=65918) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 628.515174] env[65918]: DEBUG nova.compute.manager [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Allocating IP information in the background. {{(pid=65918) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 628.515174] env[65918]: DEBUG nova.network.neutron [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] allocate_for_instance() {{(pid=65918) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 628.528954] env[65918]: DEBUG nova.compute.manager [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Start building block device mappings for instance. 
{{(pid=65918) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 628.633804] env[65918]: DEBUG nova.compute.manager [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Start spawning the instance on the hypervisor. {{(pid=65918) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 628.642301] env[65918]: DEBUG nova.network.neutron [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] [instance: 169c3642-1229-4c49-9e04-67e4e1764286] Updating instance_info_cache with network_info: [{"id": "ca75e443-247a-4fe0-b608-39335e73fbb4", "address": "fa:16:3e:95:8e:f2", "network": {"id": "0c28a7e0-4406-435a-8c52-85c8a4da3b35", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-597026994-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ffe525e4ef6f4d70afff3e304a002bea", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "53ebf5df-5ecb-4a0c-a163-d88165639de0", "external-id": "nsx-vlan-transportzone-588", "segmentation_id": 588, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapca75e443-24", "ovs_interfaceid": "ca75e443-247a-4fe0-b608-39335e73fbb4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 628.651057] env[65918]: DEBUG nova.compute.manager [req-136e8370-9055-4cdf-a0a2-6128381320fb req-e0c14c9e-ad5d-45aa-ad7b-b3ff9b77a0ab service nova] [instance: 169c3642-1229-4c49-9e04-67e4e1764286] Received event network-vif-plugged-ca75e443-247a-4fe0-b608-39335e73fbb4 {{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 628.651057] env[65918]: DEBUG oslo_concurrency.lockutils [req-136e8370-9055-4cdf-a0a2-6128381320fb req-e0c14c9e-ad5d-45aa-ad7b-b3ff9b77a0ab service nova] Acquiring lock "169c3642-1229-4c49-9e04-67e4e1764286-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 628.651057] env[65918]: DEBUG oslo_concurrency.lockutils [req-136e8370-9055-4cdf-a0a2-6128381320fb req-e0c14c9e-ad5d-45aa-ad7b-b3ff9b77a0ab service nova] Lock "169c3642-1229-4c49-9e04-67e4e1764286-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 628.655224] env[65918]: DEBUG oslo_concurrency.lockutils [req-136e8370-9055-4cdf-a0a2-6128381320fb req-e0c14c9e-ad5d-45aa-ad7b-b3ff9b77a0ab service nova] Lock "169c3642-1229-4c49-9e04-67e4e1764286-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=65918) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 628.655224] env[65918]: DEBUG nova.compute.manager [req-136e8370-9055-4cdf-a0a2-6128381320fb req-e0c14c9e-ad5d-45aa-ad7b-b3ff9b77a0ab service nova] [instance: 169c3642-1229-4c49-9e04-67e4e1764286] No waiting events found dispatching network-vif-plugged-ca75e443-247a-4fe0-b608-39335e73fbb4 {{(pid=65918) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 628.655224] env[65918]: WARNING nova.compute.manager [req-136e8370-9055-4cdf-a0a2-6128381320fb req-e0c14c9e-ad5d-45aa-ad7b-b3ff9b77a0ab service nova] [instance: 169c3642-1229-4c49-9e04-67e4e1764286] Received unexpected event network-vif-plugged-ca75e443-247a-4fe0-b608-39335e73fbb4 for instance with vm_state building and task_state spawning. [ 628.673108] env[65918]: DEBUG nova.virt.hardware [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:52:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:52:33Z,direct_url=,disk_format='vmdk',id=e017c336-3a02-4b58-874a-44a1d1e154fd,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='a06639ef48fd43768e273db76d6c8f54',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:52:33Z,virtual_size=,visibility=), allow threads: False {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 628.673108] env[65918]: DEBUG nova.virt.hardware [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Flavor limits 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 628.673108] env[65918]: DEBUG nova.virt.hardware [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Image limits 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 628.673397] env[65918]: DEBUG nova.virt.hardware [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Flavor pref 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 628.673397] env[65918]: DEBUG nova.virt.hardware [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Image pref 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 628.680339] env[65918]: DEBUG nova.virt.hardware [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 628.680339] env[65918]: DEBUG 
nova.virt.hardware [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 628.680339] env[65918]: DEBUG nova.virt.hardware [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=65918) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 628.680339] env[65918]: DEBUG nova.virt.hardware [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Got 1 possible topologies {{(pid=65918) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 628.680339] env[65918]: DEBUG nova.virt.hardware [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 628.680659] env[65918]: DEBUG nova.virt.hardware [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 628.684286] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5bad0ae-d5c7-432a-a271-60a10a93ccfb {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 628.692524] env[65918]: DEBUG oslo_concurrency.lockutils [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Releasing lock "refresh_cache-169c3642-1229-4c49-9e04-67e4e1764286" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 628.692989] env[65918]: DEBUG nova.compute.manager [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] [instance: 169c3642-1229-4c49-9e04-67e4e1764286] Instance network_info: |[{"id": "ca75e443-247a-4fe0-b608-39335e73fbb4", "address": "fa:16:3e:95:8e:f2", "network": {"id": "0c28a7e0-4406-435a-8c52-85c8a4da3b35", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-597026994-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ffe525e4ef6f4d70afff3e304a002bea", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": 
"53ebf5df-5ecb-4a0c-a163-d88165639de0", "external-id": "nsx-vlan-transportzone-588", "segmentation_id": 588, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapca75e443-24", "ovs_interfaceid": "ca75e443-247a-4fe0-b608-39335e73fbb4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=65918) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 628.693942] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] [instance: 169c3642-1229-4c49-9e04-67e4e1764286] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:95:8e:f2', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '53ebf5df-5ecb-4a0c-a163-d88165639de0', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'ca75e443-247a-4fe0-b608-39335e73fbb4', 'vif_model': 'vmxnet3'}] {{(pid=65918) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 628.707816] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Creating folder: Project (ffe525e4ef6f4d70afff3e304a002bea). Parent ref: group-v572679. {{(pid=65918) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 628.709300] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c43d41b3-d457-4cf9-ac9a-bc664ce97ff3 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 628.719071] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c8a02926-c318-4860-bc2d-e72cae4fa3ce {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 628.736915] env[65918]: INFO nova.virt.vmwareapi.vm_util [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Created folder: Project (ffe525e4ef6f4d70afff3e304a002bea) in parent group-v572679. [ 628.737112] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Creating folder: Instances. Parent ref: group-v572686. {{(pid=65918) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 628.737405] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e7797c7f-6594-4808-8f0a-cdd2d5068d45 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 628.746446] env[65918]: INFO nova.virt.vmwareapi.vm_util [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Created folder: Instances in parent group-v572686. 
[ 628.746677] env[65918]: DEBUG oslo.service.loopingcall [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=65918) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 628.746857] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 169c3642-1229-4c49-9e04-67e4e1764286] Creating VM on the ESX host {{(pid=65918) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 628.747056] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-5d016a43-7914-4d5d-91c7-0cb94b1a4138 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 628.764434] env[65918]: DEBUG nova.policy [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '77fa3b5aa8fe471480344bdb073149aa', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '33301c2fb41942968bbfec91576d4822', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=65918) authorize /opt/stack/nova/nova/policy.py:203}} [ 628.770187] env[65918]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 628.770187] env[65918]: value = "task-2848139" [ 628.770187] env[65918]: _type = "Task" [ 628.770187] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 628.783677] env[65918]: DEBUG oslo_concurrency.lockutils [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Acquiring lock "bd0158bd-e255-4680-b00e-81eb1ce88ad5" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 628.783843] env[65918]: DEBUG oslo_concurrency.lockutils [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Lock "bd0158bd-e255-4680-b00e-81eb1ce88ad5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 628.785629] env[65918]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848139, 'name': CreateVM_Task} progress is 0%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 628.794558] env[65918]: DEBUG nova.compute.manager [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Starting instance... 
{{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 628.862390] env[65918]: DEBUG oslo_concurrency.lockutils [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 628.862390] env[65918]: DEBUG oslo_concurrency.lockutils [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 628.865826] env[65918]: INFO nova.compute.claims [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 628.873609] env[65918]: DEBUG nova.network.neutron [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] Successfully updated port: 2fc1c238-d490-4df5-8812-fb2be8ea6fc7 {{(pid=65918) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 628.885115] env[65918]: DEBUG oslo_concurrency.lockutils [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Acquiring lock "refresh_cache-cf0087c7-22d0-4317-a00a-73967ccafeaa" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 628.885115] env[65918]: DEBUG oslo_concurrency.lockutils [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Acquired lock "refresh_cache-cf0087c7-22d0-4317-a00a-73967ccafeaa" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 628.885484] env[65918]: DEBUG nova.network.neutron [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] Building network info cache for instance {{(pid=65918) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 629.065543] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab24d39b-3c36-46a9-84a9-a8247630a089 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 629.075237] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d8d1693-a916-49fa-baf9-fa35ceb8a777 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 629.109177] env[65918]: DEBUG nova.network.neutron [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] Instance cache missing network info. 
{{(pid=65918) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 629.111559] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67a86ddd-9630-4762-90ac-ddcc81d0240c {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 629.119870] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-543e4e94-7d8c-4d62-8567-7b7b95385cab {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 629.135242] env[65918]: DEBUG nova.compute.provider_tree [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Inventory has not changed in ProviderTree for provider: 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 {{(pid=65918) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 629.145347] env[65918]: DEBUG nova.scheduler.client.report [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Inventory has not changed for provider 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65918) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 629.167142] env[65918]: DEBUG oslo_concurrency.lockutils [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.302s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 629.167142] env[65918]: DEBUG nova.compute.manager [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Start building networks asynchronously for instance. {{(pid=65918) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 629.225822] env[65918]: DEBUG nova.compute.utils [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Using /dev/sd instead of None {{(pid=65918) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 629.227109] env[65918]: DEBUG nova.compute.manager [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Allocating IP information in the background. 
{{(pid=65918) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 629.227289] env[65918]: DEBUG nova.network.neutron [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] allocate_for_instance() {{(pid=65918) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 629.255937] env[65918]: DEBUG nova.compute.manager [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Start building block device mappings for instance. {{(pid=65918) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 629.284788] env[65918]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848139, 'name': CreateVM_Task, 'duration_secs': 0.302196} completed successfully. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 629.284788] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 169c3642-1229-4c49-9e04-67e4e1764286] Created VM on the ESX host {{(pid=65918) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 629.285292] env[65918]: DEBUG oslo_concurrency.lockutils [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 629.285460] env[65918]: DEBUG oslo_concurrency.lockutils [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 629.285810] env[65918]: DEBUG oslo_concurrency.lockutils [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 629.286327] env[65918]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4a1c3ba7-43b5-4432-8e78-d3e37f3dadab {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 629.291885] env[65918]: DEBUG oslo_vmware.api [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Waiting for the task: (returnval){ [ 629.291885] env[65918]: value = "session[523f5ec2-7667-1e2c-bfa3-3051980fd847]5235b939-36f1-a89a-bfbb-b08349644c5d" [ 629.291885] env[65918]: _type = "Task" [ 629.291885] env[65918]: } to complete. 
{{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 629.305125] env[65918]: DEBUG oslo_vmware.api [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Task: {'id': session[523f5ec2-7667-1e2c-bfa3-3051980fd847]5235b939-36f1-a89a-bfbb-b08349644c5d, 'name': SearchDatastore_Task} progress is 0%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 629.357806] env[65918]: DEBUG nova.compute.manager [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Start spawning the instance on the hypervisor. {{(pid=65918) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 629.386734] env[65918]: DEBUG nova.virt.hardware [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:52:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:52:33Z,direct_url=,disk_format='vmdk',id=e017c336-3a02-4b58-874a-44a1d1e154fd,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='a06639ef48fd43768e273db76d6c8f54',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:52:33Z,virtual_size=,visibility=), allow threads: False {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 629.390020] env[65918]: DEBUG nova.virt.hardware [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Flavor limits 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 629.390020] env[65918]: DEBUG nova.virt.hardware [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Image limits 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 629.390020] env[65918]: DEBUG nova.virt.hardware [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Flavor pref 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 629.390020] env[65918]: DEBUG nova.virt.hardware [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Image pref 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 629.390020] env[65918]: DEBUG nova.virt.hardware [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Chose sockets=0, cores=0, threads=0; limits were 
sockets=65536, cores=65536, threads=65536 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 629.390222] env[65918]: DEBUG nova.virt.hardware [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 629.390222] env[65918]: DEBUG nova.virt.hardware [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=65918) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 629.390222] env[65918]: DEBUG nova.virt.hardware [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Got 1 possible topologies {{(pid=65918) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 629.390222] env[65918]: DEBUG nova.virt.hardware [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 629.390222] env[65918]: DEBUG nova.virt.hardware [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 629.390407] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4559cebd-71b0-4548-99e2-acab8170c9ac {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 629.399601] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f70aab6-8feb-4fd7-a8f1-bf3fca2fbac9 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 629.749581] env[65918]: DEBUG nova.policy [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd03dde4e89a94abe86e72cefd96eda08', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'aa2e2978302b4e76b809fddcba0eab40', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=65918) authorize /opt/stack/nova/nova/policy.py:203}} [ 629.805881] env[65918]: DEBUG oslo_concurrency.lockutils [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" 
{{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 629.806190] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] [instance: 169c3642-1229-4c49-9e04-67e4e1764286] Processing image e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 629.806375] env[65918]: DEBUG oslo_concurrency.lockutils [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 630.302677] env[65918]: DEBUG nova.network.neutron [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] Successfully updated port: 518f1e2f-c72f-4ba3-a1b2-10c57cde2fa8 {{(pid=65918) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 630.331653] env[65918]: DEBUG oslo_concurrency.lockutils [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Acquiring lock "refresh_cache-7bc8087e-17e1-4cbd-84be-bd6c07e104ce" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 630.331942] env[65918]: DEBUG oslo_concurrency.lockutils [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Acquired lock "refresh_cache-7bc8087e-17e1-4cbd-84be-bd6c07e104ce" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 630.331942] env[65918]: DEBUG nova.network.neutron [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] Building network info cache for instance {{(pid=65918) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 630.432614] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 630.433044] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 630.434092] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Starting heal instance info cache {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 630.434092] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Rebuilding the list of instances to heal {{(pid=65918) _heal_instance_info_cache 
/opt/stack/nova/nova/compute/manager.py:9818}} [ 630.455044] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 630.455254] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 630.455394] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: 169c3642-1229-4c49-9e04-67e4e1764286] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 630.455523] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 630.456511] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 630.456895] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 630.458029] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 630.458029] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Didn't find any instances for network info cache update. 
{{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 630.458029] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 630.458173] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 630.458366] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 630.458697] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 630.458973] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 630.459245] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 630.459474] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=65918) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 630.460360] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager.update_available_resource {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 630.472273] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 630.472644] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 630.472644] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 630.472895] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=65918) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 630.473850] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7432bffd-7abf-4573-9346-7469dbf93bbe {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 630.485965] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c95e89f-09da-4f5d-8dcf-b0f994f923b6 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 630.505032] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-03404510-9342-4b4c-9c55-816f6e7fabe7 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 630.513669] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c9aba64-58be-4969-a535-252816b27add {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 630.555637] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181073MB free_disk=168GB free_vcpus=48 pci_devices=None {{(pid=65918) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 630.558480] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 630.558480] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d 
None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.002s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 630.629128] env[65918]: DEBUG nova.network.neutron [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] Updating instance_info_cache with network_info: [{"id": "2fc1c238-d490-4df5-8812-fb2be8ea6fc7", "address": "fa:16:3e:fd:fb:27", "network": {"id": "6f8eb533-989d-48a0-b84e-5549e9b4efc9", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.156", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "a06639ef48fd43768e273db76d6c8f54", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c0d5204b-f60e-4830-84c8-2fe246c28202", "external-id": "nsx-vlan-transportzone-104", "segmentation_id": 104, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2fc1c238-d4", "ovs_interfaceid": "2fc1c238-d490-4df5-8812-fb2be8ea6fc7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 630.647847] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 2efc86dd-575c-4d78-a5ca-592f077655de actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 630.648018] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance d34229ba-b110-41aa-b68f-e2d107fd817e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 630.648160] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 169c3642-1229-4c49-9e04-67e4e1764286 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 630.648284] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance cf0087c7-22d0-4317-a00a-73967ccafeaa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 630.648407] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 7bc8087e-17e1-4cbd-84be-bd6c07e104ce actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 630.649190] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 5c83d7da-f63b-40b7-a1aa-916ba9343439 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 630.649190] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance bd0158bd-e255-4680-b00e-81eb1ce88ad5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 630.651909] env[65918]: DEBUG oslo_concurrency.lockutils [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Acquiring lock "51163f89-c8b6-48a8-bbbe-de63c44d92a5" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 630.651909] env[65918]: DEBUG oslo_concurrency.lockutils [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Lock "51163f89-c8b6-48a8-bbbe-de63c44d92a5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 630.653260] env[65918]: DEBUG oslo_concurrency.lockutils [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Releasing lock "refresh_cache-cf0087c7-22d0-4317-a00a-73967ccafeaa" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 630.653472] env[65918]: DEBUG nova.compute.manager [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] Instance network_info: |[{"id": "2fc1c238-d490-4df5-8812-fb2be8ea6fc7", "address": "fa:16:3e:fd:fb:27", "network": {"id": "6f8eb533-989d-48a0-b84e-5549e9b4efc9", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.156", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "a06639ef48fd43768e273db76d6c8f54", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": 
{"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c0d5204b-f60e-4830-84c8-2fe246c28202", "external-id": "nsx-vlan-transportzone-104", "segmentation_id": 104, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2fc1c238-d4", "ovs_interfaceid": "2fc1c238-d490-4df5-8812-fb2be8ea6fc7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=65918) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 630.654582] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:fd:fb:27', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'c0d5204b-f60e-4830-84c8-2fe246c28202', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '2fc1c238-d490-4df5-8812-fb2be8ea6fc7', 'vif_model': 'vmxnet3'}] {{(pid=65918) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 630.665965] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Creating folder: Project (b3e200cfc29d449a8025e989582f916b). Parent ref: group-v572679. {{(pid=65918) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 630.669817] env[65918]: DEBUG nova.network.neutron [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] Instance cache missing network info. {{(pid=65918) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 630.670636] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-3586cf20-dc33-4cd0-bfee-ad55ed91ea0f {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 630.674850] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 51163f89-c8b6-48a8-bbbe-de63c44d92a5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 630.675087] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Total usable vcpus: 48, total allocated vcpus: 7 {{(pid=65918) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 630.675239] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1408MB phys_disk=200GB used_disk=7GB total_vcpus=48 used_vcpus=7 pci_stats=[] {{(pid=65918) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 630.684645] env[65918]: INFO nova.virt.vmwareapi.vm_util [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Created folder: Project (b3e200cfc29d449a8025e989582f916b) in parent group-v572679. [ 630.684828] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Creating folder: Instances. Parent ref: group-v572689. {{(pid=65918) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 630.685065] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f34ce7fc-ab1c-46ec-80ee-49c7245659b6 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 630.699509] env[65918]: INFO nova.virt.vmwareapi.vm_util [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Created folder: Instances in parent group-v572689. [ 630.700122] env[65918]: DEBUG oslo.service.loopingcall [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=65918) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 630.700519] env[65918]: DEBUG nova.compute.manager [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Starting instance... {{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 630.706915] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] Creating VM on the ESX host {{(pid=65918) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 630.707640] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-e312f29a-745b-48eb-bcf7-60ee972b6881 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 630.733420] env[65918]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 630.733420] env[65918]: value = "task-2848142" [ 630.733420] env[65918]: _type = "Task" [ 630.733420] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 630.742295] env[65918]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848142, 'name': CreateVM_Task} progress is 0%. 
{{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 630.775203] env[65918]: DEBUG oslo_concurrency.lockutils [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 630.852854] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-74cd145d-a4ec-4f48-b23c-072fc91ed73a {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 630.859091] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-774442ca-64d8-4c89-a5c5-ce29fe175be6 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 630.894488] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-25cf4070-3ec5-4a1b-bc23-fd902083e565 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 630.903442] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a758ee48-b36d-4355-8fc8-fb9fa4af1224 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 630.918155] env[65918]: DEBUG nova.compute.provider_tree [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Inventory has not changed in ProviderTree for provider: 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 {{(pid=65918) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 630.928096] env[65918]: DEBUG nova.scheduler.client.report [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Inventory has not changed for provider 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65918) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 630.945626] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=65918) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 630.945823] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.388s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 630.947140] env[65918]: DEBUG oslo_concurrency.lockutils [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.171s {{(pid=65918) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 630.947602] env[65918]: INFO nova.compute.claims [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 631.070641] env[65918]: DEBUG nova.compute.manager [req-c5569258-64f8-461d-9488-d331ded8c2f7 req-2e6d3026-5891-4e10-b202-c2dc63ae5ebe service nova] [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] Received event network-changed-30b38db4-2d87-4551-a4fe-bc7427cb87d5 {{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 631.070739] env[65918]: DEBUG nova.compute.manager [req-c5569258-64f8-461d-9488-d331ded8c2f7 req-2e6d3026-5891-4e10-b202-c2dc63ae5ebe service nova] [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] Refreshing instance network info cache due to event network-changed-30b38db4-2d87-4551-a4fe-bc7427cb87d5. {{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 631.070951] env[65918]: DEBUG oslo_concurrency.lockutils [req-c5569258-64f8-461d-9488-d331ded8c2f7 req-2e6d3026-5891-4e10-b202-c2dc63ae5ebe service nova] Acquiring lock "refresh_cache-d34229ba-b110-41aa-b68f-e2d107fd817e" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 631.071100] env[65918]: DEBUG oslo_concurrency.lockutils [req-c5569258-64f8-461d-9488-d331ded8c2f7 req-2e6d3026-5891-4e10-b202-c2dc63ae5ebe service nova] Acquired lock "refresh_cache-d34229ba-b110-41aa-b68f-e2d107fd817e" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 631.071254] env[65918]: DEBUG nova.network.neutron [req-c5569258-64f8-461d-9488-d331ded8c2f7 req-2e6d3026-5891-4e10-b202-c2dc63ae5ebe service nova] [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] Refreshing network info cache for port 30b38db4-2d87-4551-a4fe-bc7427cb87d5 {{(pid=65918) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 631.218615] env[65918]: DEBUG oslo_concurrency.lockutils [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Acquiring lock "32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 631.218841] env[65918]: DEBUG oslo_concurrency.lockutils [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Lock "32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 631.248921] env[65918]: DEBUG nova.compute.manager [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Starting instance... 
{{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 631.266896] env[65918]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848142, 'name': CreateVM_Task, 'duration_secs': 0.297094} completed successfully. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 631.270349] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] Created VM on the ESX host {{(pid=65918) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 631.271250] env[65918]: DEBUG oslo_concurrency.lockutils [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 631.271403] env[65918]: DEBUG oslo_concurrency.lockutils [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 631.271940] env[65918]: DEBUG oslo_concurrency.lockutils [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 631.272233] env[65918]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-610bc64b-2a13-4d34-a812-e7be878b5953 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 631.277489] env[65918]: DEBUG oslo_vmware.api [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Waiting for the task: (returnval){ [ 631.277489] env[65918]: value = "session[523f5ec2-7667-1e2c-bfa3-3051980fd847]52a5ee0e-3e57-a67f-ae79-7bed87b2dc52" [ 631.277489] env[65918]: _type = "Task" [ 631.277489] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 631.294624] env[65918]: DEBUG oslo_vmware.api [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Task: {'id': session[523f5ec2-7667-1e2c-bfa3-3051980fd847]52a5ee0e-3e57-a67f-ae79-7bed87b2dc52, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 631.299730] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b2287aef-503a-41a1-b640-bf5727b6c6d5 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 631.315395] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c43ccbc9-1a86-430b-8635-8dccb8ea3413 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 631.358201] env[65918]: DEBUG oslo_concurrency.lockutils [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 631.359444] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f455d0f-4ec9-4c3d-a69f-dc903785559f {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 631.369744] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f142abb-1f70-4d05-aa8e-cbe83da28d99 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 631.385411] env[65918]: DEBUG nova.compute.provider_tree [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Inventory has not changed in ProviderTree for provider: 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 {{(pid=65918) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 631.396706] env[65918]: DEBUG nova.scheduler.client.report [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Inventory has not changed for provider 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65918) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 631.416558] env[65918]: DEBUG oslo_concurrency.lockutils [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.470s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 631.418029] env[65918]: DEBUG nova.compute.manager [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Start building networks asynchronously for instance. 
{{(pid=65918) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 631.420665] env[65918]: DEBUG oslo_concurrency.lockutils [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.063s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 631.422189] env[65918]: INFO nova.compute.claims [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 631.463515] env[65918]: DEBUG nova.network.neutron [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Successfully created port: 76bbb4ff-1cd3-478e-9525-0704272c8ee3 {{(pid=65918) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 631.477660] env[65918]: DEBUG nova.compute.utils [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Using /dev/sd instead of None {{(pid=65918) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 631.478930] env[65918]: DEBUG nova.compute.manager [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Allocating IP information in the background. {{(pid=65918) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 631.480133] env[65918]: DEBUG nova.network.neutron [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] allocate_for_instance() {{(pid=65918) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 631.489654] env[65918]: DEBUG nova.compute.manager [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Start building block device mappings for instance. {{(pid=65918) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 631.587867] env[65918]: DEBUG nova.compute.manager [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Start spawning the instance on the hypervisor. 
{{(pid=65918) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 631.622883] env[65918]: DEBUG nova.virt.hardware [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:52:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:52:33Z,direct_url=,disk_format='vmdk',id=e017c336-3a02-4b58-874a-44a1d1e154fd,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='a06639ef48fd43768e273db76d6c8f54',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:52:33Z,virtual_size=,visibility=), allow threads: False {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 631.623135] env[65918]: DEBUG nova.virt.hardware [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Flavor limits 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 631.623302] env[65918]: DEBUG nova.virt.hardware [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Image limits 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 631.623536] env[65918]: DEBUG nova.virt.hardware [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Flavor pref 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 631.623624] env[65918]: DEBUG nova.virt.hardware [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Image pref 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 631.623764] env[65918]: DEBUG nova.virt.hardware [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 631.623961] env[65918]: DEBUG nova.virt.hardware [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 631.624127] env[65918]: DEBUG nova.virt.hardware [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=65918) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 631.624292] env[65918]: DEBUG nova.virt.hardware [None 
req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Got 1 possible topologies {{(pid=65918) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 631.624440] env[65918]: DEBUG nova.virt.hardware [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 631.624603] env[65918]: DEBUG nova.virt.hardware [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 631.628324] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f6e7ee79-8c62-4582-bca1-0429ec0468d6 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 631.640946] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3704f9b8-a555-434a-a4c2-6b824891c6ec {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 631.704480] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c0907cb-4284-479f-9684-628b34e0524c {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 631.713273] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-257d1c51-2806-4408-9768-581e264d17ca {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 631.725531] env[65918]: DEBUG oslo_concurrency.lockutils [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Acquiring lock "934b5745-6c9a-4d21-92b8-7505a170e600" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 631.725798] env[65918]: DEBUG oslo_concurrency.lockutils [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Lock "934b5745-6c9a-4d21-92b8-7505a170e600" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.002s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 631.755254] env[65918]: DEBUG nova.network.neutron [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] Updating instance_info_cache with network_info: [{"id": "518f1e2f-c72f-4ba3-a1b2-10c57cde2fa8", "address": "fa:16:3e:87:f2:5c", "network": {"id": "523aa2b5-3ddc-4fee-a844-03a1be40ff81", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-225218807-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": 
"gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "aa2e2978302b4e76b809fddcba0eab40", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3b67e519-46cf-44ce-b670-4ba4c0c5b658", "external-id": "nsx-vlan-transportzone-110", "segmentation_id": 110, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap518f1e2f-c7", "ovs_interfaceid": "518f1e2f-c72f-4ba3-a1b2-10c57cde2fa8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 631.759236] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8cad889d-df73-4426-86c0-6b695050d5de {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 631.763185] env[65918]: DEBUG nova.compute.manager [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Starting instance... {{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 631.773196] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-07fd1550-41c4-46ac-814a-9968f3f10ea0 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 631.782075] env[65918]: DEBUG oslo_concurrency.lockutils [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Releasing lock "refresh_cache-7bc8087e-17e1-4cbd-84be-bd6c07e104ce" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 631.782369] env[65918]: DEBUG nova.compute.manager [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] Instance network_info: |[{"id": "518f1e2f-c72f-4ba3-a1b2-10c57cde2fa8", "address": "fa:16:3e:87:f2:5c", "network": {"id": "523aa2b5-3ddc-4fee-a844-03a1be40ff81", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-225218807-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "aa2e2978302b4e76b809fddcba0eab40", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3b67e519-46cf-44ce-b670-4ba4c0c5b658", "external-id": "nsx-vlan-transportzone-110", "segmentation_id": 110, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap518f1e2f-c7", "ovs_interfaceid": "518f1e2f-c72f-4ba3-a1b2-10c57cde2fa8", "qbh_params": null, 
"qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=65918) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 631.786709] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:87:f2:5c', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '3b67e519-46cf-44ce-b670-4ba4c0c5b658', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '518f1e2f-c72f-4ba3-a1b2-10c57cde2fa8', 'vif_model': 'vmxnet3'}] {{(pid=65918) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 631.794552] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Creating folder: Project (aa2e2978302b4e76b809fddcba0eab40). Parent ref: group-v572679. {{(pid=65918) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 631.804850] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9df7ffab-afc9-4052-bcc2-2c2c05a114e5 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 631.807280] env[65918]: DEBUG nova.compute.provider_tree [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Inventory has not changed in ProviderTree for provider: 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 {{(pid=65918) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 631.817466] env[65918]: DEBUG oslo_concurrency.lockutils [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 631.817700] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] Processing image e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 631.817900] env[65918]: DEBUG oslo_concurrency.lockutils [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 631.819411] env[65918]: DEBUG nova.scheduler.client.report [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Inventory has not changed for provider 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 
'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65918) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 631.826021] env[65918]: INFO nova.virt.vmwareapi.vm_util [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Created folder: Project (aa2e2978302b4e76b809fddcba0eab40) in parent group-v572679. [ 631.826021] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Creating folder: Instances. Parent ref: group-v572692. {{(pid=65918) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 631.826021] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b21f7c9c-0498-407b-a30a-6d11c512749d {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 631.839324] env[65918]: INFO nova.virt.vmwareapi.vm_util [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Created folder: Instances in parent group-v572692. [ 631.839602] env[65918]: DEBUG oslo.service.loopingcall [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=65918) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 631.840122] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] Creating VM on the ESX host {{(pid=65918) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 631.840229] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-eb817ef0-366d-47cd-b395-315ae3d57c36 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 631.860336] env[65918]: DEBUG oslo_concurrency.lockutils [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 631.861593] env[65918]: DEBUG oslo_concurrency.lockutils [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.441s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 631.862106] env[65918]: DEBUG nova.compute.manager [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Start building networks asynchronously for instance. 
{{(pid=65918) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 631.864863] env[65918]: DEBUG oslo_concurrency.lockutils [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.005s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 631.866248] env[65918]: INFO nova.compute.claims [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 631.869806] env[65918]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 631.869806] env[65918]: value = "task-2848145" [ 631.869806] env[65918]: _type = "Task" [ 631.869806] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 631.878615] env[65918]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848145, 'name': CreateVM_Task} progress is 0%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 631.933552] env[65918]: DEBUG nova.compute.utils [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Using /dev/sd instead of None {{(pid=65918) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 631.939772] env[65918]: DEBUG nova.compute.manager [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Allocating IP information in the background. {{(pid=65918) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 631.940163] env[65918]: DEBUG nova.network.neutron [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] allocate_for_instance() {{(pid=65918) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 631.960974] env[65918]: DEBUG nova.compute.manager [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Start building block device mappings for instance. 
{{(pid=65918) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 632.061963] env[65918]: DEBUG nova.policy [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8038715deff747fe81fef41297c5830d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '71110178a1894ffe9c88961a1261fbe0', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=65918) authorize /opt/stack/nova/nova/policy.py:203}} [ 632.099641] env[65918]: DEBUG nova.compute.manager [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Start spawning the instance on the hypervisor. {{(pid=65918) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 632.139445] env[65918]: DEBUG nova.virt.hardware [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:52:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:52:33Z,direct_url=,disk_format='vmdk',id=e017c336-3a02-4b58-874a-44a1d1e154fd,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='a06639ef48fd43768e273db76d6c8f54',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:52:33Z,virtual_size=,visibility=), allow threads: False {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 632.139698] env[65918]: DEBUG nova.virt.hardware [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Flavor limits 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 632.139850] env[65918]: DEBUG nova.virt.hardware [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Image limits 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 632.140156] env[65918]: DEBUG nova.virt.hardware [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Flavor pref 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 632.140334] env[65918]: DEBUG nova.virt.hardware [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Image pref 0:0:0 {{(pid=65918) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:392}} [ 632.140479] env[65918]: DEBUG nova.virt.hardware [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 632.140686] env[65918]: DEBUG nova.virt.hardware [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 632.140839] env[65918]: DEBUG nova.virt.hardware [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=65918) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 632.140998] env[65918]: DEBUG nova.virt.hardware [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Got 1 possible topologies {{(pid=65918) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 632.141276] env[65918]: DEBUG nova.virt.hardware [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 632.141486] env[65918]: DEBUG nova.virt.hardware [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 632.146418] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c6955116-88ba-49fb-87fa-6b66041b42d2 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 632.160904] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-552117e6-bdff-4390-947b-755aaa7a34d2 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 632.217900] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d29e3dd-0e39-4361-8756-428983758e33 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 632.226585] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a12eaa3-5c73-42b9-8725-46088c33a1bc {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 632.269323] env[65918]: DEBUG nova.network.neutron [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] 
Successfully created port: 505919bd-2b6d-4cb7-8361-c62802ec1625 {{(pid=65918) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 632.271810] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f10f22b4-0d8e-4ba2-82e6-765266c88db8 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 632.280924] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3feb1214-f4a6-42a4-aba2-24953c4c3ec6 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 632.297173] env[65918]: DEBUG nova.compute.provider_tree [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Inventory has not changed in ProviderTree for provider: 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 {{(pid=65918) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 632.308913] env[65918]: DEBUG nova.scheduler.client.report [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Inventory has not changed for provider 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65918) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 632.326174] env[65918]: DEBUG oslo_concurrency.lockutils [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.461s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 632.326692] env[65918]: DEBUG nova.compute.manager [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Start building networks asynchronously for instance. {{(pid=65918) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 632.365644] env[65918]: DEBUG nova.compute.utils [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Using /dev/sd instead of None {{(pid=65918) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 632.367683] env[65918]: DEBUG nova.compute.manager [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Not allocating networking since 'none' was specified. {{(pid=65918) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1948}} [ 632.382351] env[65918]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848145, 'name': CreateVM_Task, 'duration_secs': 0.284668} completed successfully. 
{{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 632.382515] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] Created VM on the ESX host {{(pid=65918) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 632.383195] env[65918]: DEBUG oslo_concurrency.lockutils [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 632.383347] env[65918]: DEBUG oslo_concurrency.lockutils [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 632.383653] env[65918]: DEBUG oslo_concurrency.lockutils [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 632.383919] env[65918]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c192d95f-c2a8-4bf8-8089-97312c3115a4 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 632.387262] env[65918]: DEBUG nova.compute.manager [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Start building block device mappings for instance. {{(pid=65918) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 632.393892] env[65918]: DEBUG oslo_vmware.api [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Waiting for the task: (returnval){ [ 632.393892] env[65918]: value = "session[523f5ec2-7667-1e2c-bfa3-3051980fd847]5200cf59-b97c-ad4a-c69e-5e87d5d78155" [ 632.393892] env[65918]: _type = "Task" [ 632.393892] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 632.404912] env[65918]: DEBUG oslo_vmware.api [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Task: {'id': session[523f5ec2-7667-1e2c-bfa3-3051980fd847]5200cf59-b97c-ad4a-c69e-5e87d5d78155, 'name': SearchDatastore_Task} progress is 0%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 632.474043] env[65918]: DEBUG nova.compute.manager [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Start spawning the instance on the hypervisor. 
{{(pid=65918) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 632.502389] env[65918]: DEBUG nova.virt.hardware [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:52:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:52:33Z,direct_url=,disk_format='vmdk',id=e017c336-3a02-4b58-874a-44a1d1e154fd,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='a06639ef48fd43768e273db76d6c8f54',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:52:33Z,virtual_size=,visibility=), allow threads: False {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 632.502508] env[65918]: DEBUG nova.virt.hardware [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Flavor limits 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 632.502784] env[65918]: DEBUG nova.virt.hardware [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Image limits 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 632.502784] env[65918]: DEBUG nova.virt.hardware [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Flavor pref 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 632.502908] env[65918]: DEBUG nova.virt.hardware [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Image pref 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 632.503539] env[65918]: DEBUG nova.virt.hardware [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 632.503539] env[65918]: DEBUG nova.virt.hardware [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 632.503539] env[65918]: DEBUG nova.virt.hardware [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=65918) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 632.503539] env[65918]: DEBUG nova.virt.hardware [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 
tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Got 1 possible topologies {{(pid=65918) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 632.503809] env[65918]: DEBUG nova.virt.hardware [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 632.503849] env[65918]: DEBUG nova.virt.hardware [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 632.504720] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f0f0e289-dff1-4be8-a6e0-3c76dafec20c {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 632.512947] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b0ded299-d8c6-4719-827e-d3200b055915 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 632.527279] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Instance VIF info [] {{(pid=65918) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 632.535334] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Creating folder: Project (cc40efc72c124840bcea7387dfb0b605). Parent ref: group-v572679. {{(pid=65918) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 632.535334] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-4096be30-4cda-4a70-a01c-3454fa221077 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 632.543727] env[65918]: INFO nova.virt.vmwareapi.vm_util [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Created folder: Project (cc40efc72c124840bcea7387dfb0b605) in parent group-v572679. [ 632.543925] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Creating folder: Instances. Parent ref: group-v572695. {{(pid=65918) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 632.544163] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d6946f52-8357-4242-817a-bc25ead7ff13 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 632.552961] env[65918]: INFO nova.virt.vmwareapi.vm_util [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Created folder: Instances in parent group-v572695. 
[ 632.552961] env[65918]: DEBUG oslo.service.loopingcall [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=65918) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 632.552961] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Creating VM on the ESX host {{(pid=65918) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 632.553174] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-d4c0d9af-306b-4699-8758-052e91ec1c28 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 632.570332] env[65918]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 632.570332] env[65918]: value = "task-2848148" [ 632.570332] env[65918]: _type = "Task" [ 632.570332] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 632.577760] env[65918]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848148, 'name': CreateVM_Task} progress is 0%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 632.642873] env[65918]: DEBUG nova.policy [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6af531e3eb5b48a0ae3cde02914a1541', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c9e9e67e7dca4ca6b7485c853433db5d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=65918) authorize /opt/stack/nova/nova/policy.py:203}} [ 632.910898] env[65918]: DEBUG oslo_concurrency.lockutils [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 632.911583] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] Processing image e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 632.912046] env[65918]: DEBUG oslo_concurrency.lockutils [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 632.995740] env[65918]: DEBUG oslo_concurrency.lockutils [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 
tempest-AttachInterfacesV270Test-118737708-project-member] Acquiring lock "bba6f3d9-1be3-4048-86d5-f435511b0fc0" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 632.999023] env[65918]: DEBUG oslo_concurrency.lockutils [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Lock "bba6f3d9-1be3-4048-86d5-f435511b0fc0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 633.090741] env[65918]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848148, 'name': CreateVM_Task, 'duration_secs': 0.276841} completed successfully. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 633.090911] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Created VM on the ESX host {{(pid=65918) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 633.091355] env[65918]: DEBUG oslo_concurrency.lockutils [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 633.091549] env[65918]: DEBUG oslo_concurrency.lockutils [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 633.091904] env[65918]: DEBUG oslo_concurrency.lockutils [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 633.092159] env[65918]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-239071f1-a60a-4ced-985e-00b30aa385a7 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 633.097189] env[65918]: DEBUG oslo_vmware.api [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Waiting for the task: (returnval){ [ 633.097189] env[65918]: value = "session[523f5ec2-7667-1e2c-bfa3-3051980fd847]52402581-f88e-f504-cd0c-9c3b096ddec8" [ 633.097189] env[65918]: _type = "Task" [ 633.097189] env[65918]: } to complete. 
{{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 633.113018] env[65918]: DEBUG oslo_concurrency.lockutils [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 633.118365] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Processing image e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 633.118606] env[65918]: DEBUG oslo_concurrency.lockutils [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 633.341781] env[65918]: DEBUG nova.compute.manager [req-72f0986e-0bcd-4b89-b925-ad7d5abdc095 req-a631ec2f-2249-488f-bafc-4df8c430e1a6 service nova] [instance: 169c3642-1229-4c49-9e04-67e4e1764286] Received event network-changed-ca75e443-247a-4fe0-b608-39335e73fbb4 {{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 633.342021] env[65918]: DEBUG nova.compute.manager [req-72f0986e-0bcd-4b89-b925-ad7d5abdc095 req-a631ec2f-2249-488f-bafc-4df8c430e1a6 service nova] [instance: 169c3642-1229-4c49-9e04-67e4e1764286] Refreshing instance network info cache due to event network-changed-ca75e443-247a-4fe0-b608-39335e73fbb4. {{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 633.342204] env[65918]: DEBUG oslo_concurrency.lockutils [req-72f0986e-0bcd-4b89-b925-ad7d5abdc095 req-a631ec2f-2249-488f-bafc-4df8c430e1a6 service nova] Acquiring lock "refresh_cache-169c3642-1229-4c49-9e04-67e4e1764286" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 633.342356] env[65918]: DEBUG oslo_concurrency.lockutils [req-72f0986e-0bcd-4b89-b925-ad7d5abdc095 req-a631ec2f-2249-488f-bafc-4df8c430e1a6 service nova] Acquired lock "refresh_cache-169c3642-1229-4c49-9e04-67e4e1764286" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 633.342520] env[65918]: DEBUG nova.network.neutron [req-72f0986e-0bcd-4b89-b925-ad7d5abdc095 req-a631ec2f-2249-488f-bafc-4df8c430e1a6 service nova] [instance: 169c3642-1229-4c49-9e04-67e4e1764286] Refreshing network info cache for port ca75e443-247a-4fe0-b608-39335e73fbb4 {{(pid=65918) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 633.507212] env[65918]: DEBUG nova.network.neutron [req-c5569258-64f8-461d-9488-d331ded8c2f7 req-2e6d3026-5891-4e10-b202-c2dc63ae5ebe service nova] [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] Updated VIF entry in instance network info cache for port 30b38db4-2d87-4551-a4fe-bc7427cb87d5. 
{{(pid=65918) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 633.507922] env[65918]: DEBUG nova.network.neutron [req-c5569258-64f8-461d-9488-d331ded8c2f7 req-2e6d3026-5891-4e10-b202-c2dc63ae5ebe service nova] [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] Updating instance_info_cache with network_info: [{"id": "30b38db4-2d87-4551-a4fe-bc7427cb87d5", "address": "fa:16:3e:14:b4:a5", "network": {"id": "6f8eb533-989d-48a0-b84e-5549e9b4efc9", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.83", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "a06639ef48fd43768e273db76d6c8f54", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c0d5204b-f60e-4830-84c8-2fe246c28202", "external-id": "nsx-vlan-transportzone-104", "segmentation_id": 104, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap30b38db4-2d", "ovs_interfaceid": "30b38db4-2d87-4551-a4fe-bc7427cb87d5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 633.526623] env[65918]: DEBUG oslo_concurrency.lockutils [req-c5569258-64f8-461d-9488-d331ded8c2f7 req-2e6d3026-5891-4e10-b202-c2dc63ae5ebe service nova] Releasing lock "refresh_cache-d34229ba-b110-41aa-b68f-e2d107fd817e" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 634.019134] env[65918]: DEBUG nova.network.neutron [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Successfully created port: 529a38dc-110d-42e4-831e-b26a9d87491d {{(pid=65918) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 634.660906] env[65918]: DEBUG nova.network.neutron [req-72f0986e-0bcd-4b89-b925-ad7d5abdc095 req-a631ec2f-2249-488f-bafc-4df8c430e1a6 service nova] [instance: 169c3642-1229-4c49-9e04-67e4e1764286] Updated VIF entry in instance network info cache for port ca75e443-247a-4fe0-b608-39335e73fbb4. 
{{(pid=65918) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 634.661276] env[65918]: DEBUG nova.network.neutron [req-72f0986e-0bcd-4b89-b925-ad7d5abdc095 req-a631ec2f-2249-488f-bafc-4df8c430e1a6 service nova] [instance: 169c3642-1229-4c49-9e04-67e4e1764286] Updating instance_info_cache with network_info: [{"id": "ca75e443-247a-4fe0-b608-39335e73fbb4", "address": "fa:16:3e:95:8e:f2", "network": {"id": "0c28a7e0-4406-435a-8c52-85c8a4da3b35", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-597026994-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ffe525e4ef6f4d70afff3e304a002bea", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "53ebf5df-5ecb-4a0c-a163-d88165639de0", "external-id": "nsx-vlan-transportzone-588", "segmentation_id": 588, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapca75e443-24", "ovs_interfaceid": "ca75e443-247a-4fe0-b608-39335e73fbb4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 634.672223] env[65918]: DEBUG oslo_concurrency.lockutils [req-72f0986e-0bcd-4b89-b925-ad7d5abdc095 req-a631ec2f-2249-488f-bafc-4df8c430e1a6 service nova] Releasing lock "refresh_cache-169c3642-1229-4c49-9e04-67e4e1764286" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 634.672482] env[65918]: DEBUG nova.compute.manager [req-72f0986e-0bcd-4b89-b925-ad7d5abdc095 req-a631ec2f-2249-488f-bafc-4df8c430e1a6 service nova] [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] Received event network-vif-plugged-2fc1c238-d490-4df5-8812-fb2be8ea6fc7 {{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 634.672678] env[65918]: DEBUG oslo_concurrency.lockutils [req-72f0986e-0bcd-4b89-b925-ad7d5abdc095 req-a631ec2f-2249-488f-bafc-4df8c430e1a6 service nova] Acquiring lock "cf0087c7-22d0-4317-a00a-73967ccafeaa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 634.672876] env[65918]: DEBUG oslo_concurrency.lockutils [req-72f0986e-0bcd-4b89-b925-ad7d5abdc095 req-a631ec2f-2249-488f-bafc-4df8c430e1a6 service nova] Lock "cf0087c7-22d0-4317-a00a-73967ccafeaa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 634.673055] env[65918]: DEBUG oslo_concurrency.lockutils [req-72f0986e-0bcd-4b89-b925-ad7d5abdc095 req-a631ec2f-2249-488f-bafc-4df8c430e1a6 service nova] Lock "cf0087c7-22d0-4317-a00a-73967ccafeaa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 634.673228] env[65918]: DEBUG nova.compute.manager 
[req-72f0986e-0bcd-4b89-b925-ad7d5abdc095 req-a631ec2f-2249-488f-bafc-4df8c430e1a6 service nova] [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] No waiting events found dispatching network-vif-plugged-2fc1c238-d490-4df5-8812-fb2be8ea6fc7 {{(pid=65918) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 634.673394] env[65918]: WARNING nova.compute.manager [req-72f0986e-0bcd-4b89-b925-ad7d5abdc095 req-a631ec2f-2249-488f-bafc-4df8c430e1a6 service nova] [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] Received unexpected event network-vif-plugged-2fc1c238-d490-4df5-8812-fb2be8ea6fc7 for instance with vm_state building and task_state spawning. [ 634.673698] env[65918]: DEBUG nova.compute.manager [req-72f0986e-0bcd-4b89-b925-ad7d5abdc095 req-a631ec2f-2249-488f-bafc-4df8c430e1a6 service nova] [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] Received event network-changed-2fc1c238-d490-4df5-8812-fb2be8ea6fc7 {{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 634.673833] env[65918]: DEBUG nova.compute.manager [req-72f0986e-0bcd-4b89-b925-ad7d5abdc095 req-a631ec2f-2249-488f-bafc-4df8c430e1a6 service nova] [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] Refreshing instance network info cache due to event network-changed-2fc1c238-d490-4df5-8812-fb2be8ea6fc7. {{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 634.673912] env[65918]: DEBUG oslo_concurrency.lockutils [req-72f0986e-0bcd-4b89-b925-ad7d5abdc095 req-a631ec2f-2249-488f-bafc-4df8c430e1a6 service nova] Acquiring lock "refresh_cache-cf0087c7-22d0-4317-a00a-73967ccafeaa" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 634.674057] env[65918]: DEBUG oslo_concurrency.lockutils [req-72f0986e-0bcd-4b89-b925-ad7d5abdc095 req-a631ec2f-2249-488f-bafc-4df8c430e1a6 service nova] Acquired lock "refresh_cache-cf0087c7-22d0-4317-a00a-73967ccafeaa" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 634.674213] env[65918]: DEBUG nova.network.neutron [req-72f0986e-0bcd-4b89-b925-ad7d5abdc095 req-a631ec2f-2249-488f-bafc-4df8c430e1a6 service nova] [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] Refreshing network info cache for port 2fc1c238-d490-4df5-8812-fb2be8ea6fc7 {{(pid=65918) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 634.882195] env[65918]: DEBUG nova.network.neutron [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Successfully updated port: 76bbb4ff-1cd3-478e-9525-0704272c8ee3 {{(pid=65918) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 634.895111] env[65918]: DEBUG oslo_concurrency.lockutils [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Acquiring lock "refresh_cache-5c83d7da-f63b-40b7-a1aa-916ba9343439" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 634.896187] env[65918]: DEBUG oslo_concurrency.lockutils [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Acquired lock "refresh_cache-5c83d7da-f63b-40b7-a1aa-916ba9343439" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 
634.896716] env[65918]: DEBUG nova.network.neutron [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Building network info cache for instance {{(pid=65918) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 634.974428] env[65918]: DEBUG nova.network.neutron [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Successfully created port: 7fa47b80-ce9c-4251-8196-8939b09283ef {{(pid=65918) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 635.042329] env[65918]: DEBUG nova.network.neutron [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Instance cache missing network info. {{(pid=65918) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 635.603934] env[65918]: DEBUG nova.network.neutron [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Successfully updated port: 505919bd-2b6d-4cb7-8361-c62802ec1625 {{(pid=65918) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 635.615600] env[65918]: DEBUG oslo_concurrency.lockutils [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Acquiring lock "refresh_cache-bd0158bd-e255-4680-b00e-81eb1ce88ad5" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 635.617737] env[65918]: DEBUG oslo_concurrency.lockutils [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Acquired lock "refresh_cache-bd0158bd-e255-4680-b00e-81eb1ce88ad5" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 635.618874] env[65918]: DEBUG nova.network.neutron [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Building network info cache for instance {{(pid=65918) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 635.905124] env[65918]: DEBUG nova.network.neutron [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Instance cache missing network info. {{(pid=65918) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 635.922462] env[65918]: DEBUG nova.network.neutron [req-72f0986e-0bcd-4b89-b925-ad7d5abdc095 req-a631ec2f-2249-488f-bafc-4df8c430e1a6 service nova] [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] Updated VIF entry in instance network info cache for port 2fc1c238-d490-4df5-8812-fb2be8ea6fc7. 
{{(pid=65918) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 635.922462] env[65918]: DEBUG nova.network.neutron [req-72f0986e-0bcd-4b89-b925-ad7d5abdc095 req-a631ec2f-2249-488f-bafc-4df8c430e1a6 service nova] [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] Updating instance_info_cache with network_info: [{"id": "2fc1c238-d490-4df5-8812-fb2be8ea6fc7", "address": "fa:16:3e:fd:fb:27", "network": {"id": "6f8eb533-989d-48a0-b84e-5549e9b4efc9", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.156", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "a06639ef48fd43768e273db76d6c8f54", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c0d5204b-f60e-4830-84c8-2fe246c28202", "external-id": "nsx-vlan-transportzone-104", "segmentation_id": 104, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2fc1c238-d4", "ovs_interfaceid": "2fc1c238-d490-4df5-8812-fb2be8ea6fc7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 635.946257] env[65918]: DEBUG nova.network.neutron [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Updating instance_info_cache with network_info: [{"id": "76bbb4ff-1cd3-478e-9525-0704272c8ee3", "address": "fa:16:3e:66:6f:bd", "network": {"id": "6f8eb533-989d-48a0-b84e-5549e9b4efc9", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.97", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "a06639ef48fd43768e273db76d6c8f54", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c0d5204b-f60e-4830-84c8-2fe246c28202", "external-id": "nsx-vlan-transportzone-104", "segmentation_id": 104, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap76bbb4ff-1c", "ovs_interfaceid": "76bbb4ff-1cd3-478e-9525-0704272c8ee3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 635.947301] env[65918]: DEBUG oslo_concurrency.lockutils [req-72f0986e-0bcd-4b89-b925-ad7d5abdc095 req-a631ec2f-2249-488f-bafc-4df8c430e1a6 service nova] Releasing lock "refresh_cache-cf0087c7-22d0-4317-a00a-73967ccafeaa" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 635.947545] env[65918]: DEBUG nova.compute.manager [req-72f0986e-0bcd-4b89-b925-ad7d5abdc095 
req-a631ec2f-2249-488f-bafc-4df8c430e1a6 service nova] [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] Received event network-vif-plugged-518f1e2f-c72f-4ba3-a1b2-10c57cde2fa8 {{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 635.948064] env[65918]: DEBUG oslo_concurrency.lockutils [req-72f0986e-0bcd-4b89-b925-ad7d5abdc095 req-a631ec2f-2249-488f-bafc-4df8c430e1a6 service nova] Acquiring lock "7bc8087e-17e1-4cbd-84be-bd6c07e104ce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 635.948064] env[65918]: DEBUG oslo_concurrency.lockutils [req-72f0986e-0bcd-4b89-b925-ad7d5abdc095 req-a631ec2f-2249-488f-bafc-4df8c430e1a6 service nova] Lock "7bc8087e-17e1-4cbd-84be-bd6c07e104ce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 635.948198] env[65918]: DEBUG oslo_concurrency.lockutils [req-72f0986e-0bcd-4b89-b925-ad7d5abdc095 req-a631ec2f-2249-488f-bafc-4df8c430e1a6 service nova] Lock "7bc8087e-17e1-4cbd-84be-bd6c07e104ce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 635.948273] env[65918]: DEBUG nova.compute.manager [req-72f0986e-0bcd-4b89-b925-ad7d5abdc095 req-a631ec2f-2249-488f-bafc-4df8c430e1a6 service nova] [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] No waiting events found dispatching network-vif-plugged-518f1e2f-c72f-4ba3-a1b2-10c57cde2fa8 {{(pid=65918) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 635.948434] env[65918]: WARNING nova.compute.manager [req-72f0986e-0bcd-4b89-b925-ad7d5abdc095 req-a631ec2f-2249-488f-bafc-4df8c430e1a6 service nova] [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] Received unexpected event network-vif-plugged-518f1e2f-c72f-4ba3-a1b2-10c57cde2fa8 for instance with vm_state building and task_state spawning. [ 635.948596] env[65918]: DEBUG nova.compute.manager [req-72f0986e-0bcd-4b89-b925-ad7d5abdc095 req-a631ec2f-2249-488f-bafc-4df8c430e1a6 service nova] [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] Received event network-changed-518f1e2f-c72f-4ba3-a1b2-10c57cde2fa8 {{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 635.948742] env[65918]: DEBUG nova.compute.manager [req-72f0986e-0bcd-4b89-b925-ad7d5abdc095 req-a631ec2f-2249-488f-bafc-4df8c430e1a6 service nova] [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] Refreshing instance network info cache due to event network-changed-518f1e2f-c72f-4ba3-a1b2-10c57cde2fa8. 
{{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 635.948945] env[65918]: DEBUG oslo_concurrency.lockutils [req-72f0986e-0bcd-4b89-b925-ad7d5abdc095 req-a631ec2f-2249-488f-bafc-4df8c430e1a6 service nova] Acquiring lock "refresh_cache-7bc8087e-17e1-4cbd-84be-bd6c07e104ce" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 635.949107] env[65918]: DEBUG oslo_concurrency.lockutils [req-72f0986e-0bcd-4b89-b925-ad7d5abdc095 req-a631ec2f-2249-488f-bafc-4df8c430e1a6 service nova] Acquired lock "refresh_cache-7bc8087e-17e1-4cbd-84be-bd6c07e104ce" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 635.949565] env[65918]: DEBUG nova.network.neutron [req-72f0986e-0bcd-4b89-b925-ad7d5abdc095 req-a631ec2f-2249-488f-bafc-4df8c430e1a6 service nova] [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] Refreshing network info cache for port 518f1e2f-c72f-4ba3-a1b2-10c57cde2fa8 {{(pid=65918) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 635.962952] env[65918]: DEBUG oslo_concurrency.lockutils [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Releasing lock "refresh_cache-5c83d7da-f63b-40b7-a1aa-916ba9343439" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 635.963281] env[65918]: DEBUG nova.compute.manager [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Instance network_info: |[{"id": "76bbb4ff-1cd3-478e-9525-0704272c8ee3", "address": "fa:16:3e:66:6f:bd", "network": {"id": "6f8eb533-989d-48a0-b84e-5549e9b4efc9", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.97", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "a06639ef48fd43768e273db76d6c8f54", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c0d5204b-f60e-4830-84c8-2fe246c28202", "external-id": "nsx-vlan-transportzone-104", "segmentation_id": 104, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap76bbb4ff-1c", "ovs_interfaceid": "76bbb4ff-1cd3-478e-9525-0704272c8ee3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=65918) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 635.963803] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:66:6f:bd', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'c0d5204b-f60e-4830-84c8-2fe246c28202', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '76bbb4ff-1cd3-478e-9525-0704272c8ee3', 'vif_model': 'vmxnet3'}] 
{{(pid=65918) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 635.972881] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Creating folder: Project (33301c2fb41942968bbfec91576d4822). Parent ref: group-v572679. {{(pid=65918) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 635.973752] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-654ab95a-808e-44e8-80ee-8cf4c660b325 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 635.986844] env[65918]: INFO nova.virt.vmwareapi.vm_util [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Created folder: Project (33301c2fb41942968bbfec91576d4822) in parent group-v572679. [ 635.987109] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Creating folder: Instances. Parent ref: group-v572698. {{(pid=65918) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 635.987344] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-4c850ef1-a35a-454a-b92f-e7e08036dd77 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 635.995798] env[65918]: INFO nova.virt.vmwareapi.vm_util [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Created folder: Instances in parent group-v572698. [ 635.996047] env[65918]: DEBUG oslo.service.loopingcall [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=65918) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 635.996937] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Creating VM on the ESX host {{(pid=65918) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 635.997035] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-ecb1a38e-8d3b-4ec8-bc34-76894360e037 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 636.019789] env[65918]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 636.019789] env[65918]: value = "task-2848151" [ 636.019789] env[65918]: _type = "Task" [ 636.019789] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 636.027736] env[65918]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848151, 'name': CreateVM_Task} progress is 0%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 636.532879] env[65918]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848151, 'name': CreateVM_Task, 'duration_secs': 0.320363} completed successfully. 
{{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 636.533858] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Created VM on the ESX host {{(pid=65918) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 636.533858] env[65918]: DEBUG oslo_concurrency.lockutils [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 636.533858] env[65918]: DEBUG oslo_concurrency.lockutils [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 636.534207] env[65918]: DEBUG oslo_concurrency.lockutils [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 636.534444] env[65918]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-78edf299-9c01-42a1-bad0-97807b70d034 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 636.540905] env[65918]: DEBUG oslo_vmware.api [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Waiting for the task: (returnval){ [ 636.540905] env[65918]: value = "session[523f5ec2-7667-1e2c-bfa3-3051980fd847]52c308af-320e-1c94-1534-4fc359b88020" [ 636.540905] env[65918]: _type = "Task" [ 636.540905] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 636.549613] env[65918]: DEBUG oslo_vmware.api [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Task: {'id': session[523f5ec2-7667-1e2c-bfa3-3051980fd847]52c308af-320e-1c94-1534-4fc359b88020, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 636.666653] env[65918]: DEBUG nova.network.neutron [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Updating instance_info_cache with network_info: [{"id": "505919bd-2b6d-4cb7-8361-c62802ec1625", "address": "fa:16:3e:fb:cc:b8", "network": {"id": "523aa2b5-3ddc-4fee-a844-03a1be40ff81", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-225218807-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "aa2e2978302b4e76b809fddcba0eab40", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3b67e519-46cf-44ce-b670-4ba4c0c5b658", "external-id": "nsx-vlan-transportzone-110", "segmentation_id": 110, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap505919bd-2b", "ovs_interfaceid": "505919bd-2b6d-4cb7-8361-c62802ec1625", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 636.698189] env[65918]: DEBUG oslo_concurrency.lockutils [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Releasing lock "refresh_cache-bd0158bd-e255-4680-b00e-81eb1ce88ad5" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 636.698529] env[65918]: DEBUG nova.compute.manager [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Instance network_info: |[{"id": "505919bd-2b6d-4cb7-8361-c62802ec1625", "address": "fa:16:3e:fb:cc:b8", "network": {"id": "523aa2b5-3ddc-4fee-a844-03a1be40ff81", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-225218807-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "aa2e2978302b4e76b809fddcba0eab40", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3b67e519-46cf-44ce-b670-4ba4c0c5b658", "external-id": "nsx-vlan-transportzone-110", "segmentation_id": 110, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap505919bd-2b", "ovs_interfaceid": "505919bd-2b6d-4cb7-8361-c62802ec1625", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=65918) 
_allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 636.698951] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:fb:cc:b8', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '3b67e519-46cf-44ce-b670-4ba4c0c5b658', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '505919bd-2b6d-4cb7-8361-c62802ec1625', 'vif_model': 'vmxnet3'}] {{(pid=65918) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 636.712652] env[65918]: DEBUG oslo.service.loopingcall [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=65918) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 636.712652] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Creating VM on the ESX host {{(pid=65918) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 636.712652] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-8ad3d0f9-c028-4104-a0e2-690850180d20 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 636.739628] env[65918]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 636.739628] env[65918]: value = "task-2848152" [ 636.739628] env[65918]: _type = "Task" [ 636.739628] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 636.746706] env[65918]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848152, 'name': CreateVM_Task} progress is 0%. 
{{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 637.056794] env[65918]: DEBUG oslo_concurrency.lockutils [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 637.060294] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Processing image e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 637.060582] env[65918]: DEBUG oslo_concurrency.lockutils [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 637.149298] env[65918]: DEBUG nova.network.neutron [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Successfully updated port: 529a38dc-110d-42e4-831e-b26a9d87491d {{(pid=65918) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 637.159078] env[65918]: DEBUG nova.compute.manager [req-a0e045d2-3ce2-49ff-8ede-3b85ee0095eb req-13aa7504-f3cd-469e-9b57-80a8f103a748 service nova] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Received event network-vif-plugged-76bbb4ff-1cd3-478e-9525-0704272c8ee3 {{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 637.159078] env[65918]: DEBUG oslo_concurrency.lockutils [req-a0e045d2-3ce2-49ff-8ede-3b85ee0095eb req-13aa7504-f3cd-469e-9b57-80a8f103a748 service nova] Acquiring lock "5c83d7da-f63b-40b7-a1aa-916ba9343439-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 637.159078] env[65918]: DEBUG oslo_concurrency.lockutils [req-a0e045d2-3ce2-49ff-8ede-3b85ee0095eb req-13aa7504-f3cd-469e-9b57-80a8f103a748 service nova] Lock "5c83d7da-f63b-40b7-a1aa-916ba9343439-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 637.159078] env[65918]: DEBUG oslo_concurrency.lockutils [req-a0e045d2-3ce2-49ff-8ede-3b85ee0095eb req-13aa7504-f3cd-469e-9b57-80a8f103a748 service nova] Lock "5c83d7da-f63b-40b7-a1aa-916ba9343439-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 637.159237] env[65918]: DEBUG nova.compute.manager [req-a0e045d2-3ce2-49ff-8ede-3b85ee0095eb req-13aa7504-f3cd-469e-9b57-80a8f103a748 service nova] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] No waiting events found dispatching 
network-vif-plugged-76bbb4ff-1cd3-478e-9525-0704272c8ee3 {{(pid=65918) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 637.159281] env[65918]: WARNING nova.compute.manager [req-a0e045d2-3ce2-49ff-8ede-3b85ee0095eb req-13aa7504-f3cd-469e-9b57-80a8f103a748 service nova] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Received unexpected event network-vif-plugged-76bbb4ff-1cd3-478e-9525-0704272c8ee3 for instance with vm_state building and task_state spawning. [ 637.160649] env[65918]: DEBUG oslo_concurrency.lockutils [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Acquiring lock "refresh_cache-51163f89-c8b6-48a8-bbbe-de63c44d92a5" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 637.160807] env[65918]: DEBUG oslo_concurrency.lockutils [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Acquired lock "refresh_cache-51163f89-c8b6-48a8-bbbe-de63c44d92a5" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 637.160953] env[65918]: DEBUG nova.network.neutron [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Building network info cache for instance {{(pid=65918) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 637.193987] env[65918]: DEBUG nova.network.neutron [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Successfully updated port: 7fa47b80-ce9c-4251-8196-8939b09283ef {{(pid=65918) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 637.204510] env[65918]: DEBUG oslo_concurrency.lockutils [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Acquiring lock "refresh_cache-32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 637.204788] env[65918]: DEBUG oslo_concurrency.lockutils [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Acquired lock "refresh_cache-32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 637.204962] env[65918]: DEBUG nova.network.neutron [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Building network info cache for instance {{(pid=65918) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 637.250309] env[65918]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848152, 'name': CreateVM_Task, 'duration_secs': 0.315424} completed successfully. 
{{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 637.250565] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Created VM on the ESX host {{(pid=65918) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 637.251352] env[65918]: DEBUG oslo_concurrency.lockutils [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 637.251542] env[65918]: DEBUG oslo_concurrency.lockutils [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 637.251882] env[65918]: DEBUG oslo_concurrency.lockutils [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 637.252454] env[65918]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5b953710-7c76-4181-8340-c55bd6d2708f {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 637.258063] env[65918]: DEBUG oslo_vmware.api [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Waiting for the task: (returnval){ [ 637.258063] env[65918]: value = "session[523f5ec2-7667-1e2c-bfa3-3051980fd847]5254691a-9ebc-5c84-eaf7-9fd6afccaf88" [ 637.258063] env[65918]: _type = "Task" [ 637.258063] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 637.267972] env[65918]: DEBUG oslo_vmware.api [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Task: {'id': session[523f5ec2-7667-1e2c-bfa3-3051980fd847]5254691a-9ebc-5c84-eaf7-9fd6afccaf88, 'name': SearchDatastore_Task} progress is 0%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 637.298493] env[65918]: DEBUG nova.network.neutron [req-72f0986e-0bcd-4b89-b925-ad7d5abdc095 req-a631ec2f-2249-488f-bafc-4df8c430e1a6 service nova] [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] Updated VIF entry in instance network info cache for port 518f1e2f-c72f-4ba3-a1b2-10c57cde2fa8. 
{{(pid=65918) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 637.298907] env[65918]: DEBUG nova.network.neutron [req-72f0986e-0bcd-4b89-b925-ad7d5abdc095 req-a631ec2f-2249-488f-bafc-4df8c430e1a6 service nova] [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] Updating instance_info_cache with network_info: [{"id": "518f1e2f-c72f-4ba3-a1b2-10c57cde2fa8", "address": "fa:16:3e:87:f2:5c", "network": {"id": "523aa2b5-3ddc-4fee-a844-03a1be40ff81", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-225218807-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "aa2e2978302b4e76b809fddcba0eab40", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3b67e519-46cf-44ce-b670-4ba4c0c5b658", "external-id": "nsx-vlan-transportzone-110", "segmentation_id": 110, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap518f1e2f-c7", "ovs_interfaceid": "518f1e2f-c72f-4ba3-a1b2-10c57cde2fa8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 637.306975] env[65918]: DEBUG nova.network.neutron [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Instance cache missing network info. {{(pid=65918) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 637.311917] env[65918]: DEBUG oslo_concurrency.lockutils [req-72f0986e-0bcd-4b89-b925-ad7d5abdc095 req-a631ec2f-2249-488f-bafc-4df8c430e1a6 service nova] Releasing lock "refresh_cache-7bc8087e-17e1-4cbd-84be-bd6c07e104ce" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 637.323859] env[65918]: DEBUG nova.network.neutron [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Instance cache missing network info. 
{{(pid=65918) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 637.336307] env[65918]: DEBUG oslo_concurrency.lockutils [None req-9037c31d-c63c-44a3-9628-15ae8133f408 tempest-ServersAdminTestJSON-1207677598 tempest-ServersAdminTestJSON-1207677598-project-member] Acquiring lock "cda52e82-61cf-4cb4-a37a-61684af013dd" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 637.336307] env[65918]: DEBUG oslo_concurrency.lockutils [None req-9037c31d-c63c-44a3-9628-15ae8133f408 tempest-ServersAdminTestJSON-1207677598 tempest-ServersAdminTestJSON-1207677598-project-member] Lock "cda52e82-61cf-4cb4-a37a-61684af013dd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 637.774181] env[65918]: DEBUG oslo_concurrency.lockutils [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 637.774515] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Processing image e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 637.774740] env[65918]: DEBUG oslo_concurrency.lockutils [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 637.791294] env[65918]: DEBUG nova.network.neutron [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Updating instance_info_cache with network_info: [{"id": "7fa47b80-ce9c-4251-8196-8939b09283ef", "address": "fa:16:3e:04:cf:1a", "network": {"id": "6f8eb533-989d-48a0-b84e-5549e9b4efc9", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.149", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "a06639ef48fd43768e273db76d6c8f54", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c0d5204b-f60e-4830-84c8-2fe246c28202", "external-id": "nsx-vlan-transportzone-104", "segmentation_id": 104, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7fa47b80-ce", "ovs_interfaceid": "7fa47b80-ce9c-4251-8196-8939b09283ef", 
"qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 637.815884] env[65918]: DEBUG oslo_concurrency.lockutils [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Releasing lock "refresh_cache-32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 637.817705] env[65918]: DEBUG nova.compute.manager [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Instance network_info: |[{"id": "7fa47b80-ce9c-4251-8196-8939b09283ef", "address": "fa:16:3e:04:cf:1a", "network": {"id": "6f8eb533-989d-48a0-b84e-5549e9b4efc9", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.149", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "a06639ef48fd43768e273db76d6c8f54", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c0d5204b-f60e-4830-84c8-2fe246c28202", "external-id": "nsx-vlan-transportzone-104", "segmentation_id": 104, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7fa47b80-ce", "ovs_interfaceid": "7fa47b80-ce9c-4251-8196-8939b09283ef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=65918) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 637.817854] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:04:cf:1a', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'c0d5204b-f60e-4830-84c8-2fe246c28202', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '7fa47b80-ce9c-4251-8196-8939b09283ef', 'vif_model': 'vmxnet3'}] {{(pid=65918) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 637.830563] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Creating folder: Project (c9e9e67e7dca4ca6b7485c853433db5d). Parent ref: group-v572679. 
{{(pid=65918) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 637.832288] env[65918]: DEBUG nova.network.neutron [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Updating instance_info_cache with network_info: [{"id": "529a38dc-110d-42e4-831e-b26a9d87491d", "address": "fa:16:3e:4c:f3:38", "network": {"id": "6f8eb533-989d-48a0-b84e-5549e9b4efc9", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.110", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "a06639ef48fd43768e273db76d6c8f54", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c0d5204b-f60e-4830-84c8-2fe246c28202", "external-id": "nsx-vlan-transportzone-104", "segmentation_id": 104, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap529a38dc-11", "ovs_interfaceid": "529a38dc-110d-42e4-831e-b26a9d87491d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 637.834857] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-3b9a2aec-37f4-400b-a11c-c2d9309aaaea {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 637.843998] env[65918]: DEBUG oslo_concurrency.lockutils [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Releasing lock "refresh_cache-51163f89-c8b6-48a8-bbbe-de63c44d92a5" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 637.844332] env[65918]: DEBUG nova.compute.manager [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Instance network_info: |[{"id": "529a38dc-110d-42e4-831e-b26a9d87491d", "address": "fa:16:3e:4c:f3:38", "network": {"id": "6f8eb533-989d-48a0-b84e-5549e9b4efc9", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.110", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "a06639ef48fd43768e273db76d6c8f54", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c0d5204b-f60e-4830-84c8-2fe246c28202", "external-id": "nsx-vlan-transportzone-104", "segmentation_id": 104, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap529a38dc-11", "ovs_interfaceid": "529a38dc-110d-42e4-831e-b26a9d87491d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", 
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=65918) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 637.844695] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:4c:f3:38', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'c0d5204b-f60e-4830-84c8-2fe246c28202', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '529a38dc-110d-42e4-831e-b26a9d87491d', 'vif_model': 'vmxnet3'}] {{(pid=65918) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 637.853414] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Creating folder: Project (71110178a1894ffe9c88961a1261fbe0). Parent ref: group-v572679. {{(pid=65918) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 637.855491] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a4c2badd-6691-4128-91cb-d1b8e3f9ec04 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 637.860130] env[65918]: INFO nova.virt.vmwareapi.vm_util [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Created folder: Project (c9e9e67e7dca4ca6b7485c853433db5d) in parent group-v572679. [ 637.860130] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Creating folder: Instances. Parent ref: group-v572702. {{(pid=65918) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 637.860560] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-70cc3583-a165-4c51-ba2f-47780061085a {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 637.873219] env[65918]: INFO nova.virt.vmwareapi.vm_util [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Created folder: Instances in parent group-v572702. [ 637.873219] env[65918]: DEBUG oslo.service.loopingcall [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=65918) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 637.873352] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Creating VM on the ESX host {{(pid=65918) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 637.873700] env[65918]: INFO nova.virt.vmwareapi.vm_util [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Created folder: Project (71110178a1894ffe9c88961a1261fbe0) in parent group-v572679. 
[ 637.873974] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Creating folder: Instances. Parent ref: group-v572703. {{(pid=65918) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 637.874553] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-59a97bc8-4ae3-4dc3-ab39-c9e5fde59209 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 637.892445] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e8e676ca-395c-4daa-b3f2-2cc465cc0ec3 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 637.904145] env[65918]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 637.904145] env[65918]: value = "task-2848157" [ 637.904145] env[65918]: _type = "Task" [ 637.904145] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 637.910989] env[65918]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848157, 'name': CreateVM_Task} progress is 0%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 637.913092] env[65918]: INFO nova.virt.vmwareapi.vm_util [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Created folder: Instances in parent group-v572703. [ 637.913458] env[65918]: DEBUG oslo.service.loopingcall [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=65918) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 637.913743] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Creating VM on the ESX host {{(pid=65918) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 637.914042] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-14534b15-7e69-45ba-b6e1-cc309191b98f {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 637.937111] env[65918]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 637.937111] env[65918]: value = "task-2848158" [ 637.937111] env[65918]: _type = "Task" [ 637.937111] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 637.952719] env[65918]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848158, 'name': CreateVM_Task} progress is 5%. 
{{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 638.144225] env[65918]: DEBUG nova.compute.manager [req-94b194ad-d71b-4fda-bab1-8274df24f30f req-11b9ba9d-4549-4105-ad2d-aed03deb953d service nova] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Received event network-vif-plugged-505919bd-2b6d-4cb7-8361-c62802ec1625 {{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 638.144225] env[65918]: DEBUG oslo_concurrency.lockutils [req-94b194ad-d71b-4fda-bab1-8274df24f30f req-11b9ba9d-4549-4105-ad2d-aed03deb953d service nova] Acquiring lock "bd0158bd-e255-4680-b00e-81eb1ce88ad5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 638.144225] env[65918]: DEBUG oslo_concurrency.lockutils [req-94b194ad-d71b-4fda-bab1-8274df24f30f req-11b9ba9d-4549-4105-ad2d-aed03deb953d service nova] Lock "bd0158bd-e255-4680-b00e-81eb1ce88ad5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 638.144225] env[65918]: DEBUG oslo_concurrency.lockutils [req-94b194ad-d71b-4fda-bab1-8274df24f30f req-11b9ba9d-4549-4105-ad2d-aed03deb953d service nova] Lock "bd0158bd-e255-4680-b00e-81eb1ce88ad5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 638.145115] env[65918]: DEBUG nova.compute.manager [req-94b194ad-d71b-4fda-bab1-8274df24f30f req-11b9ba9d-4549-4105-ad2d-aed03deb953d service nova] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] No waiting events found dispatching network-vif-plugged-505919bd-2b6d-4cb7-8361-c62802ec1625 {{(pid=65918) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 638.145115] env[65918]: WARNING nova.compute.manager [req-94b194ad-d71b-4fda-bab1-8274df24f30f req-11b9ba9d-4549-4105-ad2d-aed03deb953d service nova] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Received unexpected event network-vif-plugged-505919bd-2b6d-4cb7-8361-c62802ec1625 for instance with vm_state building and task_state spawning. [ 638.415210] env[65918]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848157, 'name': CreateVM_Task, 'duration_secs': 0.458438} completed successfully. 
{{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 638.415210] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Created VM on the ESX host {{(pid=65918) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 638.415372] env[65918]: DEBUG oslo_concurrency.lockutils [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 638.415412] env[65918]: DEBUG oslo_concurrency.lockutils [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 638.415699] env[65918]: DEBUG oslo_concurrency.lockutils [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 638.416238] env[65918]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a0e43e62-1b5f-4a0f-a76b-7f7e3e4d7a78 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 638.421865] env[65918]: DEBUG oslo_vmware.api [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Waiting for the task: (returnval){ [ 638.421865] env[65918]: value = "session[523f5ec2-7667-1e2c-bfa3-3051980fd847]528b32d8-4b02-a6a3-036a-5c422eb20c63" [ 638.421865] env[65918]: _type = "Task" [ 638.421865] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 638.434598] env[65918]: DEBUG oslo_vmware.api [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Task: {'id': session[523f5ec2-7667-1e2c-bfa3-3051980fd847]528b32d8-4b02-a6a3-036a-5c422eb20c63, 'name': SearchDatastore_Task} progress is 0%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 638.445943] env[65918]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848158, 'name': CreateVM_Task, 'duration_secs': 0.34787} completed successfully. 
{{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 638.445943] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Created VM on the ESX host {{(pid=65918) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 638.446543] env[65918]: DEBUG oslo_concurrency.lockutils [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 638.938979] env[65918]: DEBUG oslo_concurrency.lockutils [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 638.939333] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Processing image e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 638.939621] env[65918]: DEBUG oslo_concurrency.lockutils [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 638.939845] env[65918]: DEBUG oslo_concurrency.lockutils [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 638.941744] env[65918]: DEBUG oslo_concurrency.lockutils [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 638.942039] env[65918]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-91df27fd-84c0-4672-a9b9-5594665f2241 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 638.948621] env[65918]: DEBUG oslo_vmware.api [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Waiting for the task: (returnval){ [ 638.948621] env[65918]: value = "session[523f5ec2-7667-1e2c-bfa3-3051980fd847]520df388-b136-7549-d491-82d360100661" [ 638.948621] env[65918]: _type = "Task" [ 638.948621] env[65918]: } to complete. 
{{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 638.960878] env[65918]: DEBUG oslo_vmware.api [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Task: {'id': session[523f5ec2-7667-1e2c-bfa3-3051980fd847]520df388-b136-7549-d491-82d360100661, 'name': SearchDatastore_Task} progress is 0%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 639.463093] env[65918]: DEBUG oslo_concurrency.lockutils [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 639.463402] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Processing image e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 639.463691] env[65918]: DEBUG oslo_concurrency.lockutils [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 641.266771] env[65918]: DEBUG nova.compute.manager [req-96043ed9-7ad7-4ebd-a6ca-5fbcbd539102 req-4011d17e-3256-4ea8-baad-bb5405de68c6 service nova] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Received event network-changed-76bbb4ff-1cd3-478e-9525-0704272c8ee3 {{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 641.266771] env[65918]: DEBUG nova.compute.manager [req-96043ed9-7ad7-4ebd-a6ca-5fbcbd539102 req-4011d17e-3256-4ea8-baad-bb5405de68c6 service nova] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Refreshing instance network info cache due to event network-changed-76bbb4ff-1cd3-478e-9525-0704272c8ee3. 
{{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 641.266771] env[65918]: DEBUG oslo_concurrency.lockutils [req-96043ed9-7ad7-4ebd-a6ca-5fbcbd539102 req-4011d17e-3256-4ea8-baad-bb5405de68c6 service nova] Acquiring lock "refresh_cache-5c83d7da-f63b-40b7-a1aa-916ba9343439" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 641.267104] env[65918]: DEBUG oslo_concurrency.lockutils [req-96043ed9-7ad7-4ebd-a6ca-5fbcbd539102 req-4011d17e-3256-4ea8-baad-bb5405de68c6 service nova] Acquired lock "refresh_cache-5c83d7da-f63b-40b7-a1aa-916ba9343439" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 641.267104] env[65918]: DEBUG nova.network.neutron [req-96043ed9-7ad7-4ebd-a6ca-5fbcbd539102 req-4011d17e-3256-4ea8-baad-bb5405de68c6 service nova] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Refreshing network info cache for port 76bbb4ff-1cd3-478e-9525-0704272c8ee3 {{(pid=65918) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 641.762985] env[65918]: DEBUG nova.network.neutron [req-96043ed9-7ad7-4ebd-a6ca-5fbcbd539102 req-4011d17e-3256-4ea8-baad-bb5405de68c6 service nova] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Updated VIF entry in instance network info cache for port 76bbb4ff-1cd3-478e-9525-0704272c8ee3. {{(pid=65918) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 641.764875] env[65918]: DEBUG nova.network.neutron [req-96043ed9-7ad7-4ebd-a6ca-5fbcbd539102 req-4011d17e-3256-4ea8-baad-bb5405de68c6 service nova] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Updating instance_info_cache with network_info: [{"id": "76bbb4ff-1cd3-478e-9525-0704272c8ee3", "address": "fa:16:3e:66:6f:bd", "network": {"id": "6f8eb533-989d-48a0-b84e-5549e9b4efc9", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.97", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "a06639ef48fd43768e273db76d6c8f54", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c0d5204b-f60e-4830-84c8-2fe246c28202", "external-id": "nsx-vlan-transportzone-104", "segmentation_id": 104, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap76bbb4ff-1c", "ovs_interfaceid": "76bbb4ff-1cd3-478e-9525-0704272c8ee3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 641.775788] env[65918]: DEBUG oslo_concurrency.lockutils [req-96043ed9-7ad7-4ebd-a6ca-5fbcbd539102 req-4011d17e-3256-4ea8-baad-bb5405de68c6 service nova] Releasing lock "refresh_cache-5c83d7da-f63b-40b7-a1aa-916ba9343439" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 641.776048] env[65918]: DEBUG nova.compute.manager [req-96043ed9-7ad7-4ebd-a6ca-5fbcbd539102 req-4011d17e-3256-4ea8-baad-bb5405de68c6 service nova] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Received event network-vif-plugged-529a38dc-110d-42e4-831e-b26a9d87491d {{(pid=65918) 
external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 641.776234] env[65918]: DEBUG oslo_concurrency.lockutils [req-96043ed9-7ad7-4ebd-a6ca-5fbcbd539102 req-4011d17e-3256-4ea8-baad-bb5405de68c6 service nova] Acquiring lock "51163f89-c8b6-48a8-bbbe-de63c44d92a5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 641.776435] env[65918]: DEBUG oslo_concurrency.lockutils [req-96043ed9-7ad7-4ebd-a6ca-5fbcbd539102 req-4011d17e-3256-4ea8-baad-bb5405de68c6 service nova] Lock "51163f89-c8b6-48a8-bbbe-de63c44d92a5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 641.776824] env[65918]: DEBUG oslo_concurrency.lockutils [req-96043ed9-7ad7-4ebd-a6ca-5fbcbd539102 req-4011d17e-3256-4ea8-baad-bb5405de68c6 service nova] Lock "51163f89-c8b6-48a8-bbbe-de63c44d92a5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 641.776824] env[65918]: DEBUG nova.compute.manager [req-96043ed9-7ad7-4ebd-a6ca-5fbcbd539102 req-4011d17e-3256-4ea8-baad-bb5405de68c6 service nova] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] No waiting events found dispatching network-vif-plugged-529a38dc-110d-42e4-831e-b26a9d87491d {{(pid=65918) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 641.776901] env[65918]: WARNING nova.compute.manager [req-96043ed9-7ad7-4ebd-a6ca-5fbcbd539102 req-4011d17e-3256-4ea8-baad-bb5405de68c6 service nova] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Received unexpected event network-vif-plugged-529a38dc-110d-42e4-831e-b26a9d87491d for instance with vm_state building and task_state spawning. [ 641.777477] env[65918]: DEBUG nova.compute.manager [req-96043ed9-7ad7-4ebd-a6ca-5fbcbd539102 req-4011d17e-3256-4ea8-baad-bb5405de68c6 service nova] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Received event network-changed-529a38dc-110d-42e4-831e-b26a9d87491d {{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 641.777477] env[65918]: DEBUG nova.compute.manager [req-96043ed9-7ad7-4ebd-a6ca-5fbcbd539102 req-4011d17e-3256-4ea8-baad-bb5405de68c6 service nova] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Refreshing instance network info cache due to event network-changed-529a38dc-110d-42e4-831e-b26a9d87491d. 
{{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 641.777477] env[65918]: DEBUG oslo_concurrency.lockutils [req-96043ed9-7ad7-4ebd-a6ca-5fbcbd539102 req-4011d17e-3256-4ea8-baad-bb5405de68c6 service nova] Acquiring lock "refresh_cache-51163f89-c8b6-48a8-bbbe-de63c44d92a5" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 641.777477] env[65918]: DEBUG oslo_concurrency.lockutils [req-96043ed9-7ad7-4ebd-a6ca-5fbcbd539102 req-4011d17e-3256-4ea8-baad-bb5405de68c6 service nova] Acquired lock "refresh_cache-51163f89-c8b6-48a8-bbbe-de63c44d92a5" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 641.777657] env[65918]: DEBUG nova.network.neutron [req-96043ed9-7ad7-4ebd-a6ca-5fbcbd539102 req-4011d17e-3256-4ea8-baad-bb5405de68c6 service nova] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Refreshing network info cache for port 529a38dc-110d-42e4-831e-b26a9d87491d {{(pid=65918) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 641.960814] env[65918]: DEBUG nova.compute.manager [req-24081424-32d6-4a6c-a8c7-d22aea66eb71 req-53726245-2d1e-4e06-8d64-78d11c06d78e service nova] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Received event network-changed-505919bd-2b6d-4cb7-8361-c62802ec1625 {{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 641.960991] env[65918]: DEBUG nova.compute.manager [req-24081424-32d6-4a6c-a8c7-d22aea66eb71 req-53726245-2d1e-4e06-8d64-78d11c06d78e service nova] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Refreshing instance network info cache due to event network-changed-505919bd-2b6d-4cb7-8361-c62802ec1625. {{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 641.961213] env[65918]: DEBUG oslo_concurrency.lockutils [req-24081424-32d6-4a6c-a8c7-d22aea66eb71 req-53726245-2d1e-4e06-8d64-78d11c06d78e service nova] Acquiring lock "refresh_cache-bd0158bd-e255-4680-b00e-81eb1ce88ad5" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 641.961349] env[65918]: DEBUG oslo_concurrency.lockutils [req-24081424-32d6-4a6c-a8c7-d22aea66eb71 req-53726245-2d1e-4e06-8d64-78d11c06d78e service nova] Acquired lock "refresh_cache-bd0158bd-e255-4680-b00e-81eb1ce88ad5" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 641.961505] env[65918]: DEBUG nova.network.neutron [req-24081424-32d6-4a6c-a8c7-d22aea66eb71 req-53726245-2d1e-4e06-8d64-78d11c06d78e service nova] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Refreshing network info cache for port 505919bd-2b6d-4cb7-8361-c62802ec1625 {{(pid=65918) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 642.180218] env[65918]: DEBUG nova.network.neutron [req-96043ed9-7ad7-4ebd-a6ca-5fbcbd539102 req-4011d17e-3256-4ea8-baad-bb5405de68c6 service nova] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Updated VIF entry in instance network info cache for port 529a38dc-110d-42e4-831e-b26a9d87491d. 
{{(pid=65918) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 642.181058] env[65918]: DEBUG nova.network.neutron [req-96043ed9-7ad7-4ebd-a6ca-5fbcbd539102 req-4011d17e-3256-4ea8-baad-bb5405de68c6 service nova] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Updating instance_info_cache with network_info: [{"id": "529a38dc-110d-42e4-831e-b26a9d87491d", "address": "fa:16:3e:4c:f3:38", "network": {"id": "6f8eb533-989d-48a0-b84e-5549e9b4efc9", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.110", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "a06639ef48fd43768e273db76d6c8f54", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c0d5204b-f60e-4830-84c8-2fe246c28202", "external-id": "nsx-vlan-transportzone-104", "segmentation_id": 104, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap529a38dc-11", "ovs_interfaceid": "529a38dc-110d-42e4-831e-b26a9d87491d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 642.193824] env[65918]: DEBUG oslo_concurrency.lockutils [req-96043ed9-7ad7-4ebd-a6ca-5fbcbd539102 req-4011d17e-3256-4ea8-baad-bb5405de68c6 service nova] Releasing lock "refresh_cache-51163f89-c8b6-48a8-bbbe-de63c44d92a5" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 642.522537] env[65918]: DEBUG nova.network.neutron [req-24081424-32d6-4a6c-a8c7-d22aea66eb71 req-53726245-2d1e-4e06-8d64-78d11c06d78e service nova] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Updated VIF entry in instance network info cache for port 505919bd-2b6d-4cb7-8361-c62802ec1625. 
{{(pid=65918) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 642.522921] env[65918]: DEBUG nova.network.neutron [req-24081424-32d6-4a6c-a8c7-d22aea66eb71 req-53726245-2d1e-4e06-8d64-78d11c06d78e service nova] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Updating instance_info_cache with network_info: [{"id": "505919bd-2b6d-4cb7-8361-c62802ec1625", "address": "fa:16:3e:fb:cc:b8", "network": {"id": "523aa2b5-3ddc-4fee-a844-03a1be40ff81", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-225218807-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "aa2e2978302b4e76b809fddcba0eab40", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3b67e519-46cf-44ce-b670-4ba4c0c5b658", "external-id": "nsx-vlan-transportzone-110", "segmentation_id": 110, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap505919bd-2b", "ovs_interfaceid": "505919bd-2b6d-4cb7-8361-c62802ec1625", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 642.538588] env[65918]: DEBUG oslo_concurrency.lockutils [req-24081424-32d6-4a6c-a8c7-d22aea66eb71 req-53726245-2d1e-4e06-8d64-78d11c06d78e service nova] Releasing lock "refresh_cache-bd0158bd-e255-4680-b00e-81eb1ce88ad5" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 642.538588] env[65918]: DEBUG nova.compute.manager [req-24081424-32d6-4a6c-a8c7-d22aea66eb71 req-53726245-2d1e-4e06-8d64-78d11c06d78e service nova] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Received event network-vif-plugged-7fa47b80-ce9c-4251-8196-8939b09283ef {{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 642.538588] env[65918]: DEBUG oslo_concurrency.lockutils [req-24081424-32d6-4a6c-a8c7-d22aea66eb71 req-53726245-2d1e-4e06-8d64-78d11c06d78e service nova] Acquiring lock "32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 642.538588] env[65918]: DEBUG oslo_concurrency.lockutils [req-24081424-32d6-4a6c-a8c7-d22aea66eb71 req-53726245-2d1e-4e06-8d64-78d11c06d78e service nova] Lock "32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 642.538787] env[65918]: DEBUG oslo_concurrency.lockutils [req-24081424-32d6-4a6c-a8c7-d22aea66eb71 req-53726245-2d1e-4e06-8d64-78d11c06d78e service nova] Lock "32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 642.538787] env[65918]: DEBUG nova.compute.manager 
[req-24081424-32d6-4a6c-a8c7-d22aea66eb71 req-53726245-2d1e-4e06-8d64-78d11c06d78e service nova] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] No waiting events found dispatching network-vif-plugged-7fa47b80-ce9c-4251-8196-8939b09283ef {{(pid=65918) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 642.538787] env[65918]: WARNING nova.compute.manager [req-24081424-32d6-4a6c-a8c7-d22aea66eb71 req-53726245-2d1e-4e06-8d64-78d11c06d78e service nova] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Received unexpected event network-vif-plugged-7fa47b80-ce9c-4251-8196-8939b09283ef for instance with vm_state building and task_state spawning. [ 642.538787] env[65918]: DEBUG nova.compute.manager [req-24081424-32d6-4a6c-a8c7-d22aea66eb71 req-53726245-2d1e-4e06-8d64-78d11c06d78e service nova] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Received event network-changed-7fa47b80-ce9c-4251-8196-8939b09283ef {{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 642.538891] env[65918]: DEBUG nova.compute.manager [req-24081424-32d6-4a6c-a8c7-d22aea66eb71 req-53726245-2d1e-4e06-8d64-78d11c06d78e service nova] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Refreshing instance network info cache due to event network-changed-7fa47b80-ce9c-4251-8196-8939b09283ef. {{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 642.538891] env[65918]: DEBUG oslo_concurrency.lockutils [req-24081424-32d6-4a6c-a8c7-d22aea66eb71 req-53726245-2d1e-4e06-8d64-78d11c06d78e service nova] Acquiring lock "refresh_cache-32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 642.541157] env[65918]: DEBUG oslo_concurrency.lockutils [req-24081424-32d6-4a6c-a8c7-d22aea66eb71 req-53726245-2d1e-4e06-8d64-78d11c06d78e service nova] Acquired lock "refresh_cache-32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 642.542174] env[65918]: DEBUG nova.network.neutron [req-24081424-32d6-4a6c-a8c7-d22aea66eb71 req-53726245-2d1e-4e06-8d64-78d11c06d78e service nova] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Refreshing network info cache for port 7fa47b80-ce9c-4251-8196-8939b09283ef {{(pid=65918) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 643.411770] env[65918]: DEBUG nova.network.neutron [req-24081424-32d6-4a6c-a8c7-d22aea66eb71 req-53726245-2d1e-4e06-8d64-78d11c06d78e service nova] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Updated VIF entry in instance network info cache for port 7fa47b80-ce9c-4251-8196-8939b09283ef. 
{{(pid=65918) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 643.413999] env[65918]: DEBUG nova.network.neutron [req-24081424-32d6-4a6c-a8c7-d22aea66eb71 req-53726245-2d1e-4e06-8d64-78d11c06d78e service nova] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Updating instance_info_cache with network_info: [{"id": "7fa47b80-ce9c-4251-8196-8939b09283ef", "address": "fa:16:3e:04:cf:1a", "network": {"id": "6f8eb533-989d-48a0-b84e-5549e9b4efc9", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.149", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "a06639ef48fd43768e273db76d6c8f54", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c0d5204b-f60e-4830-84c8-2fe246c28202", "external-id": "nsx-vlan-transportzone-104", "segmentation_id": 104, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7fa47b80-ce", "ovs_interfaceid": "7fa47b80-ce9c-4251-8196-8939b09283ef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 643.426390] env[65918]: DEBUG oslo_concurrency.lockutils [req-24081424-32d6-4a6c-a8c7-d22aea66eb71 req-53726245-2d1e-4e06-8d64-78d11c06d78e service nova] Releasing lock "refresh_cache-32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 666.513712] env[65918]: WARNING oslo_vmware.rw_handles [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 666.513712] env[65918]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 666.513712] env[65918]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 666.513712] env[65918]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 666.513712] env[65918]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 666.513712] env[65918]: ERROR oslo_vmware.rw_handles response.begin() [ 666.513712] env[65918]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 666.513712] env[65918]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 666.513712] env[65918]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 666.513712] env[65918]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 666.513712] env[65918]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 666.513712] env[65918]: ERROR oslo_vmware.rw_handles [ 666.517436] env[65918]: DEBUG nova.virt.vmwareapi.images [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 
tempest-ServersAaction247Test-310550292-project-member] [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] Downloaded image file data e017c336-3a02-4b58-874a-44a1d1e154fd to vmware_temp/d4bf6097-fc19-4c11-bfe6-98f4caf0b216/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk on the data store datastore1 {{(pid=65918) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 666.517436] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] Caching image {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 666.517806] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Copying Virtual Disk [datastore1] vmware_temp/d4bf6097-fc19-4c11-bfe6-98f4caf0b216/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk to [datastore1] vmware_temp/d4bf6097-fc19-4c11-bfe6-98f4caf0b216/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk {{(pid=65918) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 666.519378] env[65918]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-1d9d7803-f68a-4ce1-9342-1ca37f53c764 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 666.528184] env[65918]: DEBUG oslo_vmware.api [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Waiting for the task: (returnval){ [ 666.528184] env[65918]: value = "task-2848167" [ 666.528184] env[65918]: _type = "Task" [ 666.528184] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 666.537352] env[65918]: DEBUG oslo_vmware.api [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Task: {'id': task-2848167, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 667.046896] env[65918]: DEBUG oslo_vmware.exceptions [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Fault InvalidArgument not matched. 
{{(pid=65918) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 667.046896] env[65918]: DEBUG oslo_concurrency.lockutils [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 667.050164] env[65918]: ERROR nova.compute.manager [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 667.050164] env[65918]: Faults: ['InvalidArgument'] [ 667.050164] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] Traceback (most recent call last): [ 667.050164] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 667.050164] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] yield resources [ 667.050164] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 667.050164] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] self.driver.spawn(context, instance, image_meta, [ 667.050164] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 667.050164] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] self._vmops.spawn(context, instance, image_meta, injected_files, [ 667.050164] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 667.050164] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] self._fetch_image_if_missing(context, vi) [ 667.050164] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 667.050535] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] image_cache(vi, tmp_image_ds_loc) [ 667.050535] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 667.050535] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] vm_util.copy_virtual_disk( [ 667.050535] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 667.050535] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] session._wait_for_task(vmdk_copy_task) [ 667.050535] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 
667.050535] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] return self.wait_for_task(task_ref) [ 667.050535] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 667.050535] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] return evt.wait() [ 667.050535] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 667.050535] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] result = hub.switch() [ 667.050535] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 667.050535] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] return self.greenlet.switch() [ 667.050954] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 667.050954] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] self.f(*self.args, **self.kw) [ 667.050954] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 667.050954] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] raise exceptions.translate_fault(task_info.error) [ 667.050954] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 667.050954] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] Faults: ['InvalidArgument'] [ 667.050954] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] [ 667.050954] env[65918]: INFO nova.compute.manager [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] Terminating instance [ 667.050954] env[65918]: DEBUG oslo_concurrency.lockutils [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 667.051276] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 667.051276] env[65918]: DEBUG oslo_concurrency.lockutils [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Acquiring lock "refresh_cache-2efc86dd-575c-4d78-a5ca-592f077655de" {{(pid=65918) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 667.051393] env[65918]: DEBUG oslo_concurrency.lockutils [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Acquired lock "refresh_cache-2efc86dd-575c-4d78-a5ca-592f077655de" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 667.052334] env[65918]: DEBUG nova.network.neutron [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] Building network info cache for instance {{(pid=65918) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 667.056403] env[65918]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-94e8c4d6-e1a4-4e06-899d-a926029b1597 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 667.063959] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 667.064241] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=65918) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 667.065751] env[65918]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ce5ccba5-4c42-41bb-9c78-60566ca9d6ec {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 667.074977] env[65918]: DEBUG oslo_vmware.api [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Waiting for the task: (returnval){ [ 667.074977] env[65918]: value = "session[523f5ec2-7667-1e2c-bfa3-3051980fd847]52b484d1-2d70-7973-7d14-a1853c009868" [ 667.074977] env[65918]: _type = "Task" [ 667.074977] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 667.087024] env[65918]: DEBUG oslo_vmware.api [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Task: {'id': session[523f5ec2-7667-1e2c-bfa3-3051980fd847]52b484d1-2d70-7973-7d14-a1853c009868, 'name': SearchDatastore_Task} progress is 0%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 667.161872] env[65918]: DEBUG nova.network.neutron [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] Instance cache missing network info. 
{{(pid=65918) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 667.590614] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] Preparing fetch location {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 667.591022] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Creating directory with path [datastore1] vmware_temp/211e94cd-a6f1-46ca-b3b9-6e70b6db4f88/e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 667.591289] env[65918]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5b97c1a8-2bf1-461c-bcdd-b3b2cd11513f {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 667.607123] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Created directory with path [datastore1] vmware_temp/211e94cd-a6f1-46ca-b3b9-6e70b6db4f88/e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 667.607123] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] Fetch image to [datastore1] vmware_temp/211e94cd-a6f1-46ca-b3b9-6e70b6db4f88/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 667.607123] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] Downloading image file data e017c336-3a02-4b58-874a-44a1d1e154fd to [datastore1] vmware_temp/211e94cd-a6f1-46ca-b3b9-6e70b6db4f88/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk on the data store datastore1 {{(pid=65918) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 667.607587] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9fe9363f-b749-4a9c-92fb-69088893c6e7 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 667.615465] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f3450fa-42df-4f75-ab10-0eac1960ead7 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 667.626266] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-19c5f12a-3891-4d8b-8eae-d4927bea4e4a {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 667.664291] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-d8526f21-3c4f-481e-b751-ab88076dd5e4 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 667.672372] env[65918]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-79bc7e20-d729-428b-bf51-61ca8b4bb916 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 667.696292] env[65918]: DEBUG nova.virt.vmwareapi.images [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] Downloading image file data e017c336-3a02-4b58-874a-44a1d1e154fd to the data store datastore1 {{(pid=65918) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 667.753564] env[65918]: DEBUG oslo_vmware.rw_handles [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/211e94cd-a6f1-46ca-b3b9-6e70b6db4f88/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=65918) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 667.819877] env[65918]: DEBUG nova.network.neutron [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] Updating instance_info_cache with network_info: [] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 667.824280] env[65918]: DEBUG oslo_vmware.rw_handles [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Completed reading data from the image iterator. {{(pid=65918) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 667.824579] env[65918]: DEBUG oslo_vmware.rw_handles [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/211e94cd-a6f1-46ca-b3b9-6e70b6db4f88/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=65918) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 667.834064] env[65918]: DEBUG oslo_concurrency.lockutils [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Releasing lock "refresh_cache-2efc86dd-575c-4d78-a5ca-592f077655de" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 667.834588] env[65918]: DEBUG nova.compute.manager [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] Start destroying the instance on the hypervisor. 
{{(pid=65918) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 667.834833] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] Destroying instance {{(pid=65918) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 667.836118] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a72b77a-26a8-4888-98e5-0a6cdec9919b {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 667.846719] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] Unregistering the VM {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 667.846934] env[65918]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-b7fa22fe-c4d0-4cb2-8f16-50d2d52553f7 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 667.876058] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] Unregistered the VM {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 667.876058] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] Deleting contents of the VM from datastore datastore1 {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 667.876058] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Deleting the datastore file [datastore1] 2efc86dd-575c-4d78-a5ca-592f077655de {{(pid=65918) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 667.876058] env[65918]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-677451f0-67c0-4003-89ef-99a22b3448cf {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 667.882401] env[65918]: DEBUG oslo_vmware.api [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Waiting for the task: (returnval){ [ 667.882401] env[65918]: value = "task-2848170" [ 667.882401] env[65918]: _type = "Task" [ 667.882401] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 667.893761] env[65918]: DEBUG oslo_vmware.api [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Task: {'id': task-2848170, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 668.396685] env[65918]: DEBUG oslo_vmware.api [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Task: {'id': task-2848170, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.034772} completed successfully. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 668.396984] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Deleted the datastore file {{(pid=65918) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 668.397219] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] Deleted contents of the VM from datastore datastore1 {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 668.397474] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] Instance destroyed {{(pid=65918) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 668.398030] env[65918]: INFO nova.compute.manager [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] Took 0.56 seconds to destroy the instance on the hypervisor. [ 668.398448] env[65918]: DEBUG oslo.service.loopingcall [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=65918) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 668.398607] env[65918]: DEBUG nova.compute.manager [-] [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] Skipping network deallocation for instance since networking was not requested. 
{{(pid=65918) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 668.402954] env[65918]: DEBUG nova.compute.claims [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] Aborting claim: {{(pid=65918) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 668.403206] env[65918]: DEBUG oslo_concurrency.lockutils [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 668.403437] env[65918]: DEBUG oslo_concurrency.lockutils [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 668.718143] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-637cd108-b58b-41bd-9da1-aef883152aba {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 668.734019] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e06dc2e-3381-4872-97ac-6db8872fa80f {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 668.771097] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-add62ddf-b5aa-4b21-9c12-89c90867cfa2 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 668.778331] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-791179f2-7180-45c7-b934-ed7149030894 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 668.792778] env[65918]: DEBUG nova.compute.provider_tree [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Inventory has not changed in ProviderTree for provider: 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 {{(pid=65918) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 668.802525] env[65918]: DEBUG nova.scheduler.client.report [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Inventory has not changed for provider 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65918) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 668.825504] env[65918]: DEBUG oslo_concurrency.lockutils [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 
tempest-ServersAaction247Test-310550292-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.422s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 668.826126] env[65918]: ERROR nova.compute.manager [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 668.826126] env[65918]: Faults: ['InvalidArgument'] [ 668.826126] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] Traceback (most recent call last): [ 668.826126] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 668.826126] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] self.driver.spawn(context, instance, image_meta, [ 668.826126] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 668.826126] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] self._vmops.spawn(context, instance, image_meta, injected_files, [ 668.826126] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 668.826126] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] self._fetch_image_if_missing(context, vi) [ 668.826126] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 668.826126] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] image_cache(vi, tmp_image_ds_loc) [ 668.826126] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 668.826500] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] vm_util.copy_virtual_disk( [ 668.826500] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 668.826500] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] session._wait_for_task(vmdk_copy_task) [ 668.826500] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 668.826500] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] return self.wait_for_task(task_ref) [ 668.826500] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 668.826500] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] return evt.wait() [ 668.826500] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] File 
"/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 668.826500] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] result = hub.switch() [ 668.826500] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 668.826500] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] return self.greenlet.switch() [ 668.826500] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 668.826500] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] self.f(*self.args, **self.kw) [ 668.826897] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 668.826897] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] raise exceptions.translate_fault(task_info.error) [ 668.826897] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 668.826897] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] Faults: ['InvalidArgument'] [ 668.826897] env[65918]: ERROR nova.compute.manager [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] [ 668.827136] env[65918]: DEBUG nova.compute.utils [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] VimFaultException {{(pid=65918) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 668.830238] env[65918]: DEBUG nova.compute.manager [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] Build of instance 2efc86dd-575c-4d78-a5ca-592f077655de was re-scheduled: A specified parameter was not correct: fileType [ 668.830238] env[65918]: Faults: ['InvalidArgument'] {{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 668.830663] env[65918]: DEBUG nova.compute.manager [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] Unplugging VIFs for instance {{(pid=65918) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 668.830922] env[65918]: DEBUG oslo_concurrency.lockutils [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Acquiring lock "refresh_cache-2efc86dd-575c-4d78-a5ca-592f077655de" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 668.831126] env[65918]: DEBUG oslo_concurrency.lockutils [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Acquired lock "refresh_cache-2efc86dd-575c-4d78-a5ca-592f077655de" {{(pid=65918) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 668.831583] env[65918]: DEBUG nova.network.neutron [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] Building network info cache for instance {{(pid=65918) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 668.956183] env[65918]: DEBUG nova.network.neutron [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] Instance cache missing network info. {{(pid=65918) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 669.287821] env[65918]: DEBUG nova.network.neutron [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] Updating instance_info_cache with network_info: [] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 669.301694] env[65918]: DEBUG oslo_concurrency.lockutils [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Releasing lock "refresh_cache-2efc86dd-575c-4d78-a5ca-592f077655de" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 669.302997] env[65918]: DEBUG nova.compute.manager [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=65918) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 669.302997] env[65918]: DEBUG nova.compute.manager [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] [instance: 2efc86dd-575c-4d78-a5ca-592f077655de] Skipping network deallocation for instance since networking was not requested. {{(pid=65918) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 669.408244] env[65918]: INFO nova.scheduler.client.report [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Deleted allocations for instance 2efc86dd-575c-4d78-a5ca-592f077655de [ 669.429724] env[65918]: DEBUG oslo_concurrency.lockutils [None req-dc6d4743-58ab-4c2e-9821-4f380e738545 tempest-ServersAaction247Test-310550292 tempest-ServersAaction247Test-310550292-project-member] Lock "2efc86dd-575c-4d78-a5ca-592f077655de" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 52.865s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 669.479269] env[65918]: DEBUG nova.compute.manager [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Starting instance... 
{{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 669.542367] env[65918]: DEBUG oslo_concurrency.lockutils [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 669.542662] env[65918]: DEBUG oslo_concurrency.lockutils [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 669.544443] env[65918]: INFO nova.compute.claims [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 669.778953] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-186af0fe-9e47-42d8-b737-77643ee14fc4 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 669.795292] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6898dd70-5106-49f2-9b1e-45274087f9dc {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 669.833562] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f39d5bd9-d9b0-4309-8a74-af894cb421d0 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 669.846186] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b453304-d061-4d14-bf64-439df247be68 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 669.862982] env[65918]: DEBUG nova.compute.provider_tree [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Inventory has not changed in ProviderTree for provider: 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 {{(pid=65918) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 669.871693] env[65918]: DEBUG nova.scheduler.client.report [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Inventory has not changed for provider 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65918) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 669.890349] env[65918]: DEBUG oslo_concurrency.lockutils [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 
tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.348s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 669.890887] env[65918]: DEBUG nova.compute.manager [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Start building networks asynchronously for instance. {{(pid=65918) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 669.939034] env[65918]: DEBUG nova.compute.utils [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Using /dev/sd instead of None {{(pid=65918) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 669.940402] env[65918]: DEBUG nova.compute.manager [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Allocating IP information in the background. {{(pid=65918) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 669.940804] env[65918]: DEBUG nova.network.neutron [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] allocate_for_instance() {{(pid=65918) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 669.955148] env[65918]: DEBUG nova.compute.manager [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Start building block device mappings for instance. {{(pid=65918) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 670.031685] env[65918]: DEBUG nova.compute.manager [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Start spawning the instance on the hypervisor. 
{{(pid=65918) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 670.057572] env[65918]: DEBUG nova.virt.hardware [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:52:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:52:33Z,direct_url=,disk_format='vmdk',id=e017c336-3a02-4b58-874a-44a1d1e154fd,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='a06639ef48fd43768e273db76d6c8f54',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:52:33Z,virtual_size=,visibility=), allow threads: False {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 670.057842] env[65918]: DEBUG nova.virt.hardware [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Flavor limits 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 670.059039] env[65918]: DEBUG nova.virt.hardware [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Image limits 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 670.059039] env[65918]: DEBUG nova.virt.hardware [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Flavor pref 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 670.059039] env[65918]: DEBUG nova.virt.hardware [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Image pref 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 670.059039] env[65918]: DEBUG nova.virt.hardware [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 670.059039] env[65918]: DEBUG nova.virt.hardware [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 670.059298] env[65918]: DEBUG nova.virt.hardware [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=65918) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 670.059298] env[65918]: DEBUG 
nova.virt.hardware [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Got 1 possible topologies {{(pid=65918) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 670.059298] env[65918]: DEBUG nova.virt.hardware [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 670.059298] env[65918]: DEBUG nova.virt.hardware [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 670.060177] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eee51a1a-1a64-43f6-bbdc-875787a25412 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 670.070880] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-957864e9-e326-41d4-81aa-0e0668c969f5 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 670.270211] env[65918]: DEBUG nova.policy [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a8fe5474b93f438faef708a22f536612', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '970346d6c53140b586f42829eff42654', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=65918) authorize /opt/stack/nova/nova/policy.py:203}} [ 672.123578] env[65918]: DEBUG nova.network.neutron [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Successfully created port: db3729cb-f676-4563-8e47-aebdc9956013 {{(pid=65918) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 674.719874] env[65918]: DEBUG nova.network.neutron [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Successfully updated port: db3729cb-f676-4563-8e47-aebdc9956013 {{(pid=65918) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 674.738069] env[65918]: DEBUG oslo_concurrency.lockutils [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Acquiring lock "refresh_cache-bba6f3d9-1be3-4048-86d5-f435511b0fc0" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 674.738204] env[65918]: DEBUG oslo_concurrency.lockutils [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 
tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Acquired lock "refresh_cache-bba6f3d9-1be3-4048-86d5-f435511b0fc0" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 674.738354] env[65918]: DEBUG nova.network.neutron [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Building network info cache for instance {{(pid=65918) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 675.104341] env[65918]: DEBUG nova.network.neutron [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Instance cache missing network info. {{(pid=65918) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 676.188227] env[65918]: DEBUG nova.network.neutron [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Updating instance_info_cache with network_info: [{"id": "db3729cb-f676-4563-8e47-aebdc9956013", "address": "fa:16:3e:f8:e9:af", "network": {"id": "3e685c3a-fa1d-4a25-86c2-6406f7b0965b", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1790113604-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "970346d6c53140b586f42829eff42654", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "bd3c6b64-aba2-4bdc-a693-3b4dff3ed861", "external-id": "nsx-vlan-transportzone-600", "segmentation_id": 600, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapdb3729cb-f6", "ovs_interfaceid": "db3729cb-f676-4563-8e47-aebdc9956013", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 676.209615] env[65918]: DEBUG oslo_concurrency.lockutils [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Releasing lock "refresh_cache-bba6f3d9-1be3-4048-86d5-f435511b0fc0" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 676.209973] env[65918]: DEBUG nova.compute.manager [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Instance network_info: |[{"id": "db3729cb-f676-4563-8e47-aebdc9956013", "address": "fa:16:3e:f8:e9:af", "network": {"id": "3e685c3a-fa1d-4a25-86c2-6406f7b0965b", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1790113604-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", 
"type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "970346d6c53140b586f42829eff42654", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "bd3c6b64-aba2-4bdc-a693-3b4dff3ed861", "external-id": "nsx-vlan-transportzone-600", "segmentation_id": 600, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapdb3729cb-f6", "ovs_interfaceid": "db3729cb-f676-4563-8e47-aebdc9956013", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=65918) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 676.210613] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:f8:e9:af', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'bd3c6b64-aba2-4bdc-a693-3b4dff3ed861', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'db3729cb-f676-4563-8e47-aebdc9956013', 'vif_model': 'vmxnet3'}] {{(pid=65918) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 676.223010] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Creating folder: Project (970346d6c53140b586f42829eff42654). Parent ref: group-v572679. {{(pid=65918) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 676.223630] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-564dd80a-7280-488e-a776-e8781b179290 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 676.235785] env[65918]: INFO nova.virt.vmwareapi.vm_util [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Created folder: Project (970346d6c53140b586f42829eff42654) in parent group-v572679. [ 676.235785] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Creating folder: Instances. Parent ref: group-v572712. {{(pid=65918) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 676.235785] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-fe2e3e3e-9eb4-4673-a03a-1f0582e3002f {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 676.246212] env[65918]: INFO nova.virt.vmwareapi.vm_util [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Created folder: Instances in parent group-v572712. 
[ 676.246212] env[65918]: DEBUG oslo.service.loopingcall [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=65918) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 676.246212] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Creating VM on the ESX host {{(pid=65918) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 676.246449] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-ce24d6ba-41c4-448a-a1b5-34081f27b5b9 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 676.274921] env[65918]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 676.274921] env[65918]: value = "task-2848175" [ 676.274921] env[65918]: _type = "Task" [ 676.274921] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 676.285417] env[65918]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848175, 'name': CreateVM_Task} progress is 5%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 676.788845] env[65918]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848175, 'name': CreateVM_Task, 'duration_secs': 0.321785} completed successfully. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 676.789188] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Created VM on the ESX host {{(pid=65918) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 676.789993] env[65918]: DEBUG oslo_concurrency.lockutils [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 676.790440] env[65918]: DEBUG oslo_concurrency.lockutils [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 676.791176] env[65918]: DEBUG oslo_concurrency.lockutils [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 676.791676] env[65918]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-273f82fa-d98f-4431-ad15-21dfb96384d1 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 676.796980] env[65918]: DEBUG oslo_vmware.api [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Waiting for the task: (returnval){ [ 676.796980] env[65918]: value = 
"session[523f5ec2-7667-1e2c-bfa3-3051980fd847]52f16d57-7457-ffcb-e9ff-9e0c36db194b" [ 676.796980] env[65918]: _type = "Task" [ 676.796980] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 676.806292] env[65918]: DEBUG oslo_vmware.api [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Task: {'id': session[523f5ec2-7667-1e2c-bfa3-3051980fd847]52f16d57-7457-ffcb-e9ff-9e0c36db194b, 'name': SearchDatastore_Task} progress is 0%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 677.309459] env[65918]: DEBUG oslo_concurrency.lockutils [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 677.310267] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Processing image e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 677.310600] env[65918]: DEBUG oslo_concurrency.lockutils [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 677.843626] env[65918]: DEBUG nova.compute.manager [req-c65c836f-48be-445a-99e9-28ab6bd55bdb req-a00b54ee-ce67-4f60-82ed-64465cd19444 service nova] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Received event network-vif-plugged-db3729cb-f676-4563-8e47-aebdc9956013 {{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 677.843836] env[65918]: DEBUG oslo_concurrency.lockutils [req-c65c836f-48be-445a-99e9-28ab6bd55bdb req-a00b54ee-ce67-4f60-82ed-64465cd19444 service nova] Acquiring lock "bba6f3d9-1be3-4048-86d5-f435511b0fc0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 677.844064] env[65918]: DEBUG oslo_concurrency.lockutils [req-c65c836f-48be-445a-99e9-28ab6bd55bdb req-a00b54ee-ce67-4f60-82ed-64465cd19444 service nova] Lock "bba6f3d9-1be3-4048-86d5-f435511b0fc0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 677.844329] env[65918]: DEBUG oslo_concurrency.lockutils [req-c65c836f-48be-445a-99e9-28ab6bd55bdb req-a00b54ee-ce67-4f60-82ed-64465cd19444 service nova] Lock "bba6f3d9-1be3-4048-86d5-f435511b0fc0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 677.845574] env[65918]: DEBUG nova.compute.manager 
[req-c65c836f-48be-445a-99e9-28ab6bd55bdb req-a00b54ee-ce67-4f60-82ed-64465cd19444 service nova] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] No waiting events found dispatching network-vif-plugged-db3729cb-f676-4563-8e47-aebdc9956013 {{(pid=65918) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 677.845649] env[65918]: WARNING nova.compute.manager [req-c65c836f-48be-445a-99e9-28ab6bd55bdb req-a00b54ee-ce67-4f60-82ed-64465cd19444 service nova] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Received unexpected event network-vif-plugged-db3729cb-f676-4563-8e47-aebdc9956013 for instance with vm_state building and task_state spawning. [ 681.220849] env[65918]: DEBUG nova.compute.manager [req-8c432ab7-7c35-4538-83c6-a0d78aefb93b req-74f86a7a-86e3-4178-96e7-8c4ffb657237 service nova] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Received event network-changed-db3729cb-f676-4563-8e47-aebdc9956013 {{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 681.221146] env[65918]: DEBUG nova.compute.manager [req-8c432ab7-7c35-4538-83c6-a0d78aefb93b req-74f86a7a-86e3-4178-96e7-8c4ffb657237 service nova] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Refreshing instance network info cache due to event network-changed-db3729cb-f676-4563-8e47-aebdc9956013. {{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 681.221306] env[65918]: DEBUG oslo_concurrency.lockutils [req-8c432ab7-7c35-4538-83c6-a0d78aefb93b req-74f86a7a-86e3-4178-96e7-8c4ffb657237 service nova] Acquiring lock "refresh_cache-bba6f3d9-1be3-4048-86d5-f435511b0fc0" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 681.221449] env[65918]: DEBUG oslo_concurrency.lockutils [req-8c432ab7-7c35-4538-83c6-a0d78aefb93b req-74f86a7a-86e3-4178-96e7-8c4ffb657237 service nova] Acquired lock "refresh_cache-bba6f3d9-1be3-4048-86d5-f435511b0fc0" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 681.221612] env[65918]: DEBUG nova.network.neutron [req-8c432ab7-7c35-4538-83c6-a0d78aefb93b req-74f86a7a-86e3-4178-96e7-8c4ffb657237 service nova] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Refreshing network info cache for port db3729cb-f676-4563-8e47-aebdc9956013 {{(pid=65918) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 682.265641] env[65918]: DEBUG nova.network.neutron [req-8c432ab7-7c35-4538-83c6-a0d78aefb93b req-74f86a7a-86e3-4178-96e7-8c4ffb657237 service nova] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Updated VIF entry in instance network info cache for port db3729cb-f676-4563-8e47-aebdc9956013. 
{{(pid=65918) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 682.265980] env[65918]: DEBUG nova.network.neutron [req-8c432ab7-7c35-4538-83c6-a0d78aefb93b req-74f86a7a-86e3-4178-96e7-8c4ffb657237 service nova] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Updating instance_info_cache with network_info: [{"id": "db3729cb-f676-4563-8e47-aebdc9956013", "address": "fa:16:3e:f8:e9:af", "network": {"id": "3e685c3a-fa1d-4a25-86c2-6406f7b0965b", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1790113604-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "970346d6c53140b586f42829eff42654", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "bd3c6b64-aba2-4bdc-a693-3b4dff3ed861", "external-id": "nsx-vlan-transportzone-600", "segmentation_id": 600, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapdb3729cb-f6", "ovs_interfaceid": "db3729cb-f676-4563-8e47-aebdc9956013", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 682.281467] env[65918]: DEBUG oslo_concurrency.lockutils [req-8c432ab7-7c35-4538-83c6-a0d78aefb93b req-74f86a7a-86e3-4178-96e7-8c4ffb657237 service nova] Releasing lock "refresh_cache-bba6f3d9-1be3-4048-86d5-f435511b0fc0" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 690.940492] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 690.940492] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 690.992605] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 690.992605] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Starting heal instance info cache {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 690.992605] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Rebuilding the list of instances to heal {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 691.075921] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] Skipping network cache update for instance because it is Building. 
{{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 691.077222] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: 169c3642-1229-4c49-9e04-67e4e1764286] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 691.077222] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 691.077507] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 691.077844] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 691.081015] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 691.081015] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 691.081015] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 691.081015] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 691.081015] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 691.081205] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Didn't find any instances for network info cache update. 
{{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 691.083479] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 691.083547] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 691.083676] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=65918) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 691.083836] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager.update_available_resource {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 691.106417] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 691.106417] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 691.106417] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 691.106417] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=65918) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 691.106417] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61f1ee5b-1b14-4a69-8421-190923f3c5c4 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 691.116642] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-19cf5cd4-3557-4ab4-ab41-9f95e9c2f55b {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 691.132299] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3a47ee6-ed7c-493c-932d-903f8e631954 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 691.141729] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-521033d2-723d-45e4-9317-f3f3f9a15c89 {{(pid=65918) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 691.191751] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181065MB free_disk=168GB free_vcpus=48 pci_devices=None {{(pid=65918) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 691.191751] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 691.191751] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 691.308950] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance d34229ba-b110-41aa-b68f-e2d107fd817e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 691.309147] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 169c3642-1229-4c49-9e04-67e4e1764286 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 691.309282] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance cf0087c7-22d0-4317-a00a-73967ccafeaa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 691.309409] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 7bc8087e-17e1-4cbd-84be-bd6c07e104ce actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 691.309534] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 5c83d7da-f63b-40b7-a1aa-916ba9343439 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 691.309654] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance bd0158bd-e255-4680-b00e-81eb1ce88ad5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 691.309811] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 51163f89-c8b6-48a8-bbbe-de63c44d92a5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 691.309885] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 691.309994] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 934b5745-6c9a-4d21-92b8-7505a170e600 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 691.310144] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance bba6f3d9-1be3-4048-86d5-f435511b0fc0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 691.310480] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=65918) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 691.310535] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=65918) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 691.499971] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7392501-0326-4213-8949-edd5c7082514 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 691.508781] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-721b51a3-3dd7-4ce4-89f3-fb7f218310b7 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 691.544261] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ad5b779-82e9-4f47-9ee5-c31a10e6df06 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 691.553033] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29a54d6b-52e1-4691-ac1a-ef15cabb7219 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 691.568974] env[65918]: DEBUG nova.compute.provider_tree [None req-32ffbafa-5870-4871-bce4-645109ad039d 
None None] Inventory has not changed in ProviderTree for provider: 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 {{(pid=65918) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 691.581836] env[65918]: DEBUG nova.scheduler.client.report [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Inventory has not changed for provider 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65918) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 691.597030] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=65918) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 691.597243] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.407s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 691.937642] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 691.938183] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 692.426243] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 692.426243] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 700.014573] env[65918]: DEBUG oslo_concurrency.lockutils [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] Acquiring lock "3b3f8c10-5ba5-445c-a51d-5404874df3d9" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 700.014810] env[65918]: DEBUG oslo_concurrency.lockutils [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] Lock "3b3f8c10-5ba5-445c-a51d-5404874df3d9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=65918) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 702.176912] env[65918]: DEBUG oslo_concurrency.lockutils [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] Acquiring lock "0ccebca0-a1a4-48b2-9154-1c73350dab38" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 702.177291] env[65918]: DEBUG oslo_concurrency.lockutils [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] Lock "0ccebca0-a1a4-48b2-9154-1c73350dab38" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 703.564024] env[65918]: DEBUG oslo_concurrency.lockutils [None req-502be215-496b-4061-872b-f8d891b6da74 tempest-ServersTestJSON-1292267645 tempest-ServersTestJSON-1292267645-project-member] Acquiring lock "80ab0def-dc17-4708-a606-4671d3e869b0" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 703.564253] env[65918]: DEBUG oslo_concurrency.lockutils [None req-502be215-496b-4061-872b-f8d891b6da74 tempest-ServersTestJSON-1292267645 tempest-ServersTestJSON-1292267645-project-member] Lock "80ab0def-dc17-4708-a606-4671d3e869b0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 704.380233] env[65918]: DEBUG oslo_concurrency.lockutils [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] Acquiring lock "a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 704.380411] env[65918]: DEBUG oslo_concurrency.lockutils [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] Lock "a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 704.772807] env[65918]: DEBUG oslo_concurrency.lockutils [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] Acquiring lock "c04e5253-0275-4fb3-8eca-6a395c95930f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 704.773117] env[65918]: DEBUG oslo_concurrency.lockutils [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] Lock 
"c04e5253-0275-4fb3-8eca-6a395c95930f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 707.784822] env[65918]: DEBUG oslo_concurrency.lockutils [None req-41df5eb8-b4fa-4d6e-af83-77fa3c068667 tempest-ServersNegativeTestMultiTenantJSON-1030047277 tempest-ServersNegativeTestMultiTenantJSON-1030047277-project-member] Acquiring lock "190285fc-ed83-417a-90db-b3c94feb4ce3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 707.785156] env[65918]: DEBUG oslo_concurrency.lockutils [None req-41df5eb8-b4fa-4d6e-af83-77fa3c068667 tempest-ServersNegativeTestMultiTenantJSON-1030047277 tempest-ServersNegativeTestMultiTenantJSON-1030047277-project-member] Lock "190285fc-ed83-417a-90db-b3c94feb4ce3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 708.210731] env[65918]: DEBUG oslo_concurrency.lockutils [None req-2de2001b-6e31-4a3e-ba47-25f1155cd840 tempest-ServersTestMultiNic-1319304383 tempest-ServersTestMultiNic-1319304383-project-member] Acquiring lock "12059024-91af-400d-be6b-36fe9482b22b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 708.211773] env[65918]: DEBUG oslo_concurrency.lockutils [None req-2de2001b-6e31-4a3e-ba47-25f1155cd840 tempest-ServersTestMultiNic-1319304383 tempest-ServersTestMultiNic-1319304383-project-member] Lock "12059024-91af-400d-be6b-36fe9482b22b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 708.707313] env[65918]: DEBUG oslo_concurrency.lockutils [None req-a6ad83fa-20b6-48f3-bd50-e393db572a46 tempest-ServerTagsTestJSON-733766138 tempest-ServerTagsTestJSON-733766138-project-member] Acquiring lock "78576ca1-7755-4532-82ee-de46c9d3a1fc" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 708.707313] env[65918]: DEBUG oslo_concurrency.lockutils [None req-a6ad83fa-20b6-48f3-bd50-e393db572a46 tempest-ServerTagsTestJSON-733766138 tempest-ServerTagsTestJSON-733766138-project-member] Lock "78576ca1-7755-4532-82ee-de46c9d3a1fc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 709.661928] env[65918]: DEBUG oslo_concurrency.lockutils [None req-e9bad983-af7e-4d4a-a324-eca345a65263 tempest-ServerMetadataTestJSON-324132329 tempest-ServerMetadataTestJSON-324132329-project-member] Acquiring lock "bac2eb3c-464c-4859-afa6-d7a16ec452a0" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 709.661928] env[65918]: DEBUG 
oslo_concurrency.lockutils [None req-e9bad983-af7e-4d4a-a324-eca345a65263 tempest-ServerMetadataTestJSON-324132329 tempest-ServerMetadataTestJSON-324132329-project-member] Lock "bac2eb3c-464c-4859-afa6-d7a16ec452a0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 710.966975] env[65918]: DEBUG oslo_concurrency.lockutils [None req-2c4ae3ce-20f3-4a08-b1a6-ec9ef3e26f9e tempest-ServersTestJSON-1292267645 tempest-ServersTestJSON-1292267645-project-member] Acquiring lock "a95f80f7-67e0-4f35-a3fb-5ecd02c783ba" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 710.967299] env[65918]: DEBUG oslo_concurrency.lockutils [None req-2c4ae3ce-20f3-4a08-b1a6-ec9ef3e26f9e tempest-ServersTestJSON-1292267645 tempest-ServersTestJSON-1292267645-project-member] Lock "a95f80f7-67e0-4f35-a3fb-5ecd02c783ba" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 711.204931] env[65918]: DEBUG oslo_concurrency.lockutils [None req-99bc0aed-9d41-40af-8116-4fa30aedc390 tempest-AttachInterfacesTestJSON-833441806 tempest-AttachInterfacesTestJSON-833441806-project-member] Acquiring lock "89dd139c-4533-4d48-aefa-750086205ad1" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 711.205289] env[65918]: DEBUG oslo_concurrency.lockutils [None req-99bc0aed-9d41-40af-8116-4fa30aedc390 tempest-AttachInterfacesTestJSON-833441806 tempest-AttachInterfacesTestJSON-833441806-project-member] Lock "89dd139c-4533-4d48-aefa-750086205ad1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 715.339457] env[65918]: DEBUG oslo_concurrency.lockutils [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Acquiring lock "46e1dfe1-df73-430c-85ef-f5753974eed0" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 715.339457] env[65918]: DEBUG oslo_concurrency.lockutils [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Lock "46e1dfe1-df73-430c-85ef-f5753974eed0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 716.770636] env[65918]: WARNING oslo_vmware.rw_handles [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 716.770636] 
env[65918]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 716.770636] env[65918]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 716.770636] env[65918]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 716.770636] env[65918]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 716.770636] env[65918]: ERROR oslo_vmware.rw_handles response.begin() [ 716.770636] env[65918]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 716.770636] env[65918]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 716.770636] env[65918]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 716.770636] env[65918]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 716.770636] env[65918]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 716.770636] env[65918]: ERROR oslo_vmware.rw_handles [ 716.771555] env[65918]: DEBUG nova.virt.vmwareapi.images [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] Downloaded image file data e017c336-3a02-4b58-874a-44a1d1e154fd to vmware_temp/211e94cd-a6f1-46ca-b3b9-6e70b6db4f88/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk on the data store datastore1 {{(pid=65918) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 716.772534] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] Caching image {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 716.772771] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Copying Virtual Disk [datastore1] vmware_temp/211e94cd-a6f1-46ca-b3b9-6e70b6db4f88/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk to [datastore1] vmware_temp/211e94cd-a6f1-46ca-b3b9-6e70b6db4f88/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk {{(pid=65918) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 716.775646] env[65918]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-53f6fddd-f19e-4043-a139-b1acba998d99 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 716.787310] env[65918]: DEBUG oslo_vmware.api [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Waiting for the task: (returnval){ [ 716.787310] env[65918]: value = "task-2848177" [ 716.787310] env[65918]: _type = "Task" [ 716.787310] env[65918]: } to complete. 
{{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 716.798257] env[65918]: DEBUG oslo_vmware.api [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Task: {'id': task-2848177, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 717.300750] env[65918]: DEBUG oslo_vmware.exceptions [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Fault InvalidArgument not matched. {{(pid=65918) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 717.303363] env[65918]: DEBUG oslo_concurrency.lockutils [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 717.303363] env[65918]: ERROR nova.compute.manager [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 717.303363] env[65918]: Faults: ['InvalidArgument'] [ 717.303363] env[65918]: ERROR nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] Traceback (most recent call last): [ 717.303363] env[65918]: ERROR nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 717.303363] env[65918]: ERROR nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] yield resources [ 717.303363] env[65918]: ERROR nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 717.303363] env[65918]: ERROR nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] self.driver.spawn(context, instance, image_meta, [ 717.303363] env[65918]: ERROR nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 717.303363] env[65918]: ERROR nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 717.303673] env[65918]: ERROR nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 717.303673] env[65918]: ERROR nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] self._fetch_image_if_missing(context, vi) [ 717.303673] env[65918]: ERROR nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 717.303673] env[65918]: ERROR nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] image_cache(vi, tmp_image_ds_loc) [ 717.303673] env[65918]: ERROR nova.compute.manager [instance: 
d34229ba-b110-41aa-b68f-e2d107fd817e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 717.303673] env[65918]: ERROR nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] vm_util.copy_virtual_disk( [ 717.303673] env[65918]: ERROR nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 717.303673] env[65918]: ERROR nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] session._wait_for_task(vmdk_copy_task) [ 717.303673] env[65918]: ERROR nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 717.303673] env[65918]: ERROR nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] return self.wait_for_task(task_ref) [ 717.303673] env[65918]: ERROR nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 717.303673] env[65918]: ERROR nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] return evt.wait() [ 717.303673] env[65918]: ERROR nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 717.304010] env[65918]: ERROR nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] result = hub.switch() [ 717.304010] env[65918]: ERROR nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 717.304010] env[65918]: ERROR nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] return self.greenlet.switch() [ 717.304010] env[65918]: ERROR nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 717.304010] env[65918]: ERROR nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] self.f(*self.args, **self.kw) [ 717.304010] env[65918]: ERROR nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 717.304010] env[65918]: ERROR nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] raise exceptions.translate_fault(task_info.error) [ 717.304010] env[65918]: ERROR nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 717.304010] env[65918]: ERROR nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] Faults: ['InvalidArgument'] [ 717.304010] env[65918]: ERROR nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] [ 717.304010] env[65918]: INFO nova.compute.manager [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] Terminating instance [ 717.305168] env[65918]: DEBUG oslo_concurrency.lockutils [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Acquired lock "[datastore1] 
devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 717.305366] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 717.305989] env[65918]: DEBUG nova.compute.manager [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] Start destroying the instance on the hypervisor. {{(pid=65918) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 717.306697] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] Destroying instance {{(pid=65918) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 717.306924] env[65918]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7e261bc2-c6ca-480d-87a6-439843973aa4 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 717.310945] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a847d072-c4c3-4d6d-866a-a88874751db7 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 717.323715] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] Unregistering the VM {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 717.323715] env[65918]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-57555e1a-d06a-4e9d-ad68-91398714a892 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 717.323959] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 717.323959] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=65918) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 717.325182] env[65918]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-88b9a18a-415e-41a8-9be8-a367b0f344a9 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 717.335121] env[65918]: DEBUG oslo_vmware.api [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Waiting for the task: (returnval){ [ 717.335121] env[65918]: value = "session[523f5ec2-7667-1e2c-bfa3-3051980fd847]52f41d2a-d1ec-72e5-c72c-640242d11120" [ 717.335121] env[65918]: _type = "Task" [ 717.335121] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 717.348696] env[65918]: DEBUG oslo_vmware.api [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Task: {'id': session[523f5ec2-7667-1e2c-bfa3-3051980fd847]52f41d2a-d1ec-72e5-c72c-640242d11120, 'name': SearchDatastore_Task} progress is 0%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 717.396641] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] Unregistered the VM {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 717.396641] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] Deleting contents of the VM from datastore datastore1 {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 717.396641] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Deleting the datastore file [datastore1] d34229ba-b110-41aa-b68f-e2d107fd817e {{(pid=65918) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 717.396641] env[65918]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-9d9966bc-b124-4501-ae89-cd53c6a30c64 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 717.405589] env[65918]: DEBUG oslo_vmware.api [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Waiting for the task: (returnval){ [ 717.405589] env[65918]: value = "task-2848179" [ 717.405589] env[65918]: _type = "Task" [ 717.405589] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 717.417905] env[65918]: DEBUG oslo_vmware.api [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Task: {'id': task-2848179, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 717.615640] env[65918]: DEBUG oslo_concurrency.lockutils [None req-79c3c05b-9782-4d61-b5ee-7c85bd1cee6b tempest-DeleteServersTestJSON-2103343594 tempest-DeleteServersTestJSON-2103343594-project-member] Acquiring lock "c07fe815-199c-41e8-b102-d2ffae0bb12c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 717.615805] env[65918]: DEBUG oslo_concurrency.lockutils [None req-79c3c05b-9782-4d61-b5ee-7c85bd1cee6b tempest-DeleteServersTestJSON-2103343594 tempest-DeleteServersTestJSON-2103343594-project-member] Lock "c07fe815-199c-41e8-b102-d2ffae0bb12c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.002s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 717.851613] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] [instance: 169c3642-1229-4c49-9e04-67e4e1764286] Preparing fetch location {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 717.852054] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Creating directory with path [datastore1] vmware_temp/d4ebbe06-09fd-4dfb-be30-4cba106b1056/e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 717.852311] env[65918]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-113c0727-a3e3-40f6-9318-de899779e5f4 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 717.865337] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Created directory with path [datastore1] vmware_temp/d4ebbe06-09fd-4dfb-be30-4cba106b1056/e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 717.865337] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] [instance: 169c3642-1229-4c49-9e04-67e4e1764286] Fetch image to [datastore1] vmware_temp/d4ebbe06-09fd-4dfb-be30-4cba106b1056/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 717.865337] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] [instance: 169c3642-1229-4c49-9e04-67e4e1764286] Downloading image file data e017c336-3a02-4b58-874a-44a1d1e154fd to [datastore1] vmware_temp/d4ebbe06-09fd-4dfb-be30-4cba106b1056/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk on the data store datastore1 {{(pid=65918) 
_fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 717.865963] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-064df87c-0d8f-4941-abda-b0b367ddab81 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 717.876680] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7b531bab-e809-4a89-9387-9e03afb250db {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 717.888019] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7f63cb8-d90b-4765-ba86-68e95421cbcc {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 717.927736] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b9ccfb0-c87b-4c4d-8c72-6039d60b5f47 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 717.936260] env[65918]: DEBUG oslo_vmware.api [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Task: {'id': task-2848179, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.079191} completed successfully. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 717.938319] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Deleted the datastore file {{(pid=65918) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 717.938714] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] Deleted contents of the VM from datastore datastore1 {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 717.939070] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] Instance destroyed {{(pid=65918) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 717.939396] env[65918]: INFO nova.compute.manager [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] Took 0.63 seconds to destroy the instance on the hypervisor. 
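Editor's note: the spawn failure and cleanup above follow the task-polling pattern that recurs throughout this log. Each vCenter call (CreateVM_Task, SearchDatastore_Task, CopyVirtualDisk_Task, DeleteDatastoreFile_Task) returns a task reference; oslo.vmware's wait_for_task polls it in a looping call, logging "progress is N%", and when the task reaches an error state the VIM fault is translated into an exception such as the VimFaultException ("A specified parameter was not correct: fileType", Faults: ['InvalidArgument']) in the traceback, which Nova answers by destroying the half-built VM and aborting its resource claim. The sketch below is a minimal, self-contained illustration of that loop only; FakeTask, TaskFault and the poll interval are invented for the example and are not the oslo.vmware or Nova APIs.

import itertools
import time


class TaskFault(Exception):
    # Illustrative stand-in for oslo_vmware.exceptions.VimFaultException.
    def __init__(self, msg, fault_list):
        super().__init__(msg)
        self.fault_list = fault_list


class FakeTask:
    # Hypothetical task whose reported state moves queued -> running -> error,
    # mimicking the CopyVirtualDisk_Task progress lines seen in the log.
    def __init__(self):
        self._states = itertools.chain(
            [("queued", 0), ("running", 5), ("running", 42)],
            itertools.repeat(("error", 42)))

    def poll(self):
        state, progress = next(self._states)
        return {"state": state, "progress": progress,
                "error": "A specified parameter was not correct: fileType",
                "faults": ["InvalidArgument"]}


def wait_for_task(task, poll_interval=0.5):
    # Poll until the task finishes; raise on an error state. This mirrors the
    # behaviour visible in the log (_poll_task logging progress, then a fault
    # being raised on failure) but is only a sketch, not the real
    # oslo_vmware.api code.
    while True:
        info = task.poll()
        if info["state"] == "success":
            return info
        if info["state"] == "error":
            # oslo.vmware would call exceptions.translate_fault() here.
            raise TaskFault(info["error"], info["faults"])
        print("Task progress is %d%%." % info["progress"])
        time.sleep(poll_interval)


if __name__ == "__main__":
    try:
        wait_for_task(FakeTask(), poll_interval=0.1)
    except TaskFault as exc:
        # In Nova, this is roughly the point where the log above shows the
        # instance being destroyed and the compute_resources claim aborted.
        print("Spawn failed: %s (faults: %s)" % (exc, exc.fault_list))

Running the sketch prints a few progress lines and then the failure message, which is the same shape of control flow the records above trace across CreateVM_Task, the failed CopyVirtualDisk_Task, and the DeleteDatastoreFile_Task cleanup.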
[ 717.941781] env[65918]: DEBUG nova.compute.claims [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] Aborting claim: {{(pid=65918) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 717.942092] env[65918]: DEBUG oslo_concurrency.lockutils [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 717.942501] env[65918]: DEBUG oslo_concurrency.lockutils [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 717.945172] env[65918]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-0061b48a-0d28-497e-8b29-97ba7afa6934 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 718.037951] env[65918]: DEBUG nova.virt.vmwareapi.images [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] [instance: 169c3642-1229-4c49-9e04-67e4e1764286] Downloading image file data e017c336-3a02-4b58-874a-44a1d1e154fd to the data store datastore1 {{(pid=65918) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 718.095201] env[65918]: DEBUG oslo_vmware.rw_handles [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d4ebbe06-09fd-4dfb-be30-4cba106b1056/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=65918) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 718.152916] env[65918]: DEBUG oslo_vmware.rw_handles [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Completed reading data from the image iterator. {{(pid=65918) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 718.153151] env[65918]: DEBUG oslo_vmware.rw_handles [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d4ebbe06-09fd-4dfb-be30-4cba106b1056/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=65918) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 718.383743] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5074a3a3-465d-45d7-ae31-24a2002bc2ba {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 718.393010] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-74b78208-ba2a-4601-90bd-a3d97931b9a6 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 718.430815] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-135432f9-33c1-4e04-9311-9a1dd70254ee {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 718.439207] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b594a2d-9754-4105-a05e-6fb7688a9d00 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 718.453932] env[65918]: DEBUG nova.compute.provider_tree [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Inventory has not changed in ProviderTree for provider: 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 {{(pid=65918) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 718.466082] env[65918]: DEBUG nova.scheduler.client.report [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Inventory has not changed for provider 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65918) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 718.485095] env[65918]: DEBUG oslo_concurrency.lockutils [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.542s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 718.485590] env[65918]: ERROR nova.compute.manager [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 718.485590] env[65918]: Faults: ['InvalidArgument'] [ 718.485590] env[65918]: ERROR nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] Traceback (most recent call last): [ 718.485590] env[65918]: ERROR nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 718.485590] env[65918]: ERROR 
nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] self.driver.spawn(context, instance, image_meta, [ 718.485590] env[65918]: ERROR nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 718.485590] env[65918]: ERROR nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 718.485590] env[65918]: ERROR nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 718.485590] env[65918]: ERROR nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] self._fetch_image_if_missing(context, vi) [ 718.485590] env[65918]: ERROR nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 718.485590] env[65918]: ERROR nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] image_cache(vi, tmp_image_ds_loc) [ 718.485590] env[65918]: ERROR nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 718.485866] env[65918]: ERROR nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] vm_util.copy_virtual_disk( [ 718.485866] env[65918]: ERROR nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 718.485866] env[65918]: ERROR nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] session._wait_for_task(vmdk_copy_task) [ 718.485866] env[65918]: ERROR nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 718.485866] env[65918]: ERROR nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] return self.wait_for_task(task_ref) [ 718.485866] env[65918]: ERROR nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 718.485866] env[65918]: ERROR nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] return evt.wait() [ 718.485866] env[65918]: ERROR nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 718.485866] env[65918]: ERROR nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] result = hub.switch() [ 718.485866] env[65918]: ERROR nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 718.485866] env[65918]: ERROR nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] return self.greenlet.switch() [ 718.485866] env[65918]: ERROR nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 718.485866] env[65918]: ERROR nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] self.f(*self.args, **self.kw) [ 718.486147] env[65918]: ERROR nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task 
[ 718.486147] env[65918]: ERROR nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] raise exceptions.translate_fault(task_info.error) [ 718.486147] env[65918]: ERROR nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 718.486147] env[65918]: ERROR nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] Faults: ['InvalidArgument'] [ 718.486147] env[65918]: ERROR nova.compute.manager [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] [ 718.486408] env[65918]: DEBUG nova.compute.utils [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] VimFaultException {{(pid=65918) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 718.488753] env[65918]: DEBUG nova.compute.manager [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] Build of instance d34229ba-b110-41aa-b68f-e2d107fd817e was re-scheduled: A specified parameter was not correct: fileType [ 718.488753] env[65918]: Faults: ['InvalidArgument'] {{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 718.490386] env[65918]: DEBUG nova.compute.manager [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] Unplugging VIFs for instance {{(pid=65918) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 718.490665] env[65918]: DEBUG nova.compute.manager [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=65918) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 718.490883] env[65918]: DEBUG nova.compute.manager [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] Deallocating network for instance {{(pid=65918) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 718.491248] env[65918]: DEBUG nova.network.neutron [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] deallocate_for_instance() {{(pid=65918) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 718.940905] env[65918]: DEBUG nova.network.neutron [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] Updating instance_info_cache with network_info: [] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 718.965964] env[65918]: INFO nova.compute.manager [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] [instance: d34229ba-b110-41aa-b68f-e2d107fd817e] Took 0.47 seconds to deallocate network for instance. [ 719.090278] env[65918]: INFO nova.scheduler.client.report [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Deleted allocations for instance d34229ba-b110-41aa-b68f-e2d107fd817e [ 719.116039] env[65918]: DEBUG oslo_concurrency.lockutils [None req-2e81d69d-1dd1-4bdd-b858-918fd46beb82 tempest-ServerDiagnosticsNegativeTest-2089491686 tempest-ServerDiagnosticsNegativeTest-2089491686-project-member] Lock "d34229ba-b110-41aa-b68f-e2d107fd817e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 101.747s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 719.149810] env[65918]: DEBUG nova.compute.manager [None req-9037c31d-c63c-44a3-9628-15ae8133f408 tempest-ServersAdminTestJSON-1207677598 tempest-ServersAdminTestJSON-1207677598-project-member] [instance: cda52e82-61cf-4cb4-a37a-61684af013dd] Starting instance... {{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 719.180785] env[65918]: DEBUG nova.compute.manager [None req-9037c31d-c63c-44a3-9628-15ae8133f408 tempest-ServersAdminTestJSON-1207677598 tempest-ServersAdminTestJSON-1207677598-project-member] [instance: cda52e82-61cf-4cb4-a37a-61684af013dd] Instance disappeared before build. 
{{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 719.215608] env[65918]: DEBUG oslo_concurrency.lockutils [None req-9037c31d-c63c-44a3-9628-15ae8133f408 tempest-ServersAdminTestJSON-1207677598 tempest-ServersAdminTestJSON-1207677598-project-member] Lock "cda52e82-61cf-4cb4-a37a-61684af013dd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 81.881s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 719.227900] env[65918]: DEBUG nova.compute.manager [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Starting instance... {{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 719.297942] env[65918]: DEBUG oslo_concurrency.lockutils [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 719.298257] env[65918]: DEBUG oslo_concurrency.lockutils [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 719.299929] env[65918]: INFO nova.compute.claims [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 719.698895] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b395536a-bcaa-4bbd-91ac-31ebd762c06d {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 719.709263] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-52210037-af61-46bf-bb77-79bb6d129ad2 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 719.743055] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67f3d07d-cae5-4a31-9446-90552a03abd0 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 719.750881] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42c5bbda-b26c-4443-b5e8-95c869f52f84 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 719.766705] env[65918]: DEBUG nova.compute.provider_tree [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] Inventory has not changed in ProviderTree for provider: 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 {{(pid=65918) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 719.775633] env[65918]: DEBUG nova.scheduler.client.report [None 
req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] Inventory has not changed for provider 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65918) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 719.796038] env[65918]: DEBUG oslo_concurrency.lockutils [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.498s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 719.796551] env[65918]: DEBUG nova.compute.manager [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Start building networks asynchronously for instance. {{(pid=65918) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 719.831935] env[65918]: DEBUG nova.compute.utils [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] Using /dev/sd instead of None {{(pid=65918) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 719.833246] env[65918]: DEBUG nova.compute.manager [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Allocating IP information in the background. {{(pid=65918) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 719.833419] env[65918]: DEBUG nova.network.neutron [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] allocate_for_instance() {{(pid=65918) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 719.846909] env[65918]: DEBUG nova.compute.manager [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Start building block device mappings for instance. {{(pid=65918) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 719.931204] env[65918]: DEBUG nova.compute.manager [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Start spawning the instance on the hypervisor. 
{{(pid=65918) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 719.958950] env[65918]: DEBUG nova.virt.hardware [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:52:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:52:33Z,direct_url=,disk_format='vmdk',id=e017c336-3a02-4b58-874a-44a1d1e154fd,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='a06639ef48fd43768e273db76d6c8f54',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:52:33Z,virtual_size=,visibility=), allow threads: False {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 719.959269] env[65918]: DEBUG nova.virt.hardware [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] Flavor limits 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 719.959367] env[65918]: DEBUG nova.virt.hardware [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] Image limits 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 719.959552] env[65918]: DEBUG nova.virt.hardware [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] Flavor pref 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 719.959698] env[65918]: DEBUG nova.virt.hardware [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] Image pref 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 719.959846] env[65918]: DEBUG nova.virt.hardware [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 719.960062] env[65918]: DEBUG nova.virt.hardware [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 719.960224] env[65918]: DEBUG nova.virt.hardware [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=65918) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 719.960390] env[65918]: DEBUG nova.virt.hardware [None 
req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] Got 1 possible topologies {{(pid=65918) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 719.960551] env[65918]: DEBUG nova.virt.hardware [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 719.960759] env[65918]: DEBUG nova.virt.hardware [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 719.961634] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0aea7f5-5513-44ad-837a-3ceb85cf6d17 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 719.966519] env[65918]: DEBUG nova.policy [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'aeea00fafcd34b34b8706005b3da15de', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0e9bba0ff94f491a9c4e7f8e7b32c4d8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=65918) authorize /opt/stack/nova/nova/policy.py:203}} [ 719.977126] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7aa7c158-9301-4584-80f3-6328282f4435 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 720.464196] env[65918]: DEBUG nova.network.neutron [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Successfully created port: e22ef964-3212-4cc4-b343-cd2af9c8a17d {{(pid=65918) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 721.415166] env[65918]: DEBUG oslo_concurrency.lockutils [None req-83c5f202-7d7d-4f1a-83b6-697f2b8e6f7f tempest-MultipleCreateTestJSON-1481152747 tempest-MultipleCreateTestJSON-1481152747-project-member] Acquiring lock "6af69c4f-4822-4170-94a0-cdc587c825f7" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 721.415514] env[65918]: DEBUG oslo_concurrency.lockutils [None req-83c5f202-7d7d-4f1a-83b6-697f2b8e6f7f tempest-MultipleCreateTestJSON-1481152747 tempest-MultipleCreateTestJSON-1481152747-project-member] Lock "6af69c4f-4822-4170-94a0-cdc587c825f7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 
721.436764] env[65918]: DEBUG nova.network.neutron [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Successfully updated port: e22ef964-3212-4cc4-b343-cd2af9c8a17d {{(pid=65918) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 721.450566] env[65918]: DEBUG oslo_concurrency.lockutils [None req-83c5f202-7d7d-4f1a-83b6-697f2b8e6f7f tempest-MultipleCreateTestJSON-1481152747 tempest-MultipleCreateTestJSON-1481152747-project-member] Acquiring lock "efe8bbf3-f76d-4509-85d1-ffff559358b5" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 721.450848] env[65918]: DEBUG oslo_concurrency.lockutils [None req-83c5f202-7d7d-4f1a-83b6-697f2b8e6f7f tempest-MultipleCreateTestJSON-1481152747 tempest-MultipleCreateTestJSON-1481152747-project-member] Lock "efe8bbf3-f76d-4509-85d1-ffff559358b5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 721.455152] env[65918]: DEBUG oslo_concurrency.lockutils [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] Acquiring lock "refresh_cache-3b3f8c10-5ba5-445c-a51d-5404874df3d9" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 721.455223] env[65918]: DEBUG oslo_concurrency.lockutils [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] Acquired lock "refresh_cache-3b3f8c10-5ba5-445c-a51d-5404874df3d9" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 721.455323] env[65918]: DEBUG nova.network.neutron [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Building network info cache for instance {{(pid=65918) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 721.495906] env[65918]: DEBUG nova.network.neutron [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Instance cache missing network info. 
{{(pid=65918) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 721.722369] env[65918]: DEBUG nova.network.neutron [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Updating instance_info_cache with network_info: [{"id": "e22ef964-3212-4cc4-b343-cd2af9c8a17d", "address": "fa:16:3e:b3:99:b9", "network": {"id": "f83c5645-7cec-4eaf-8208-f38a93be2d65", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1516857676-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0e9bba0ff94f491a9c4e7f8e7b32c4d8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "352165bb-004f-4180-9627-3a275dbe18af", "external-id": "nsx-vlan-transportzone-926", "segmentation_id": 926, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape22ef964-32", "ovs_interfaceid": "e22ef964-3212-4cc4-b343-cd2af9c8a17d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 721.736026] env[65918]: DEBUG oslo_concurrency.lockutils [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] Releasing lock "refresh_cache-3b3f8c10-5ba5-445c-a51d-5404874df3d9" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 721.736026] env[65918]: DEBUG nova.compute.manager [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Instance network_info: |[{"id": "e22ef964-3212-4cc4-b343-cd2af9c8a17d", "address": "fa:16:3e:b3:99:b9", "network": {"id": "f83c5645-7cec-4eaf-8208-f38a93be2d65", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1516857676-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0e9bba0ff94f491a9c4e7f8e7b32c4d8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "352165bb-004f-4180-9627-3a275dbe18af", "external-id": "nsx-vlan-transportzone-926", "segmentation_id": 926, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape22ef964-32", "ovs_interfaceid": "e22ef964-3212-4cc4-b343-cd2af9c8a17d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=65918) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1967}} [ 721.736201] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:b3:99:b9', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '352165bb-004f-4180-9627-3a275dbe18af', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'e22ef964-3212-4cc4-b343-cd2af9c8a17d', 'vif_model': 'vmxnet3'}] {{(pid=65918) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 721.745389] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] Creating folder: Project (0e9bba0ff94f491a9c4e7f8e7b32c4d8). Parent ref: group-v572679. {{(pid=65918) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 721.746347] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-da4253b6-5a86-4c33-a175-ff3c755435d3 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 721.757450] env[65918]: INFO nova.virt.vmwareapi.vm_util [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] Created folder: Project (0e9bba0ff94f491a9c4e7f8e7b32c4d8) in parent group-v572679. [ 721.757958] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] Creating folder: Instances. Parent ref: group-v572715. {{(pid=65918) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 721.758236] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f99becc3-4b6e-4a08-80aa-eaf4ea506f01 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 721.767490] env[65918]: INFO nova.virt.vmwareapi.vm_util [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] Created folder: Instances in parent group-v572715. [ 721.767737] env[65918]: DEBUG oslo.service.loopingcall [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=65918) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 721.768194] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Creating VM on the ESX host {{(pid=65918) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 721.768194] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-8626877d-95d8-452e-a6b6-fb26db4b7320 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 721.788396] env[65918]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 721.788396] env[65918]: value = "task-2848182" [ 721.788396] env[65918]: _type = "Task" [ 721.788396] env[65918]: } to complete. 
{{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 721.798163] env[65918]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848182, 'name': CreateVM_Task} progress is 0%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 722.099039] env[65918]: DEBUG nova.compute.manager [req-f63e0c8a-0aa6-49c1-b303-6e5cb4cd28ff req-f0297563-2600-41dd-92de-564a3006e7ad service nova] [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Received event network-vif-plugged-e22ef964-3212-4cc4-b343-cd2af9c8a17d {{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 722.099039] env[65918]: DEBUG oslo_concurrency.lockutils [req-f63e0c8a-0aa6-49c1-b303-6e5cb4cd28ff req-f0297563-2600-41dd-92de-564a3006e7ad service nova] Acquiring lock "3b3f8c10-5ba5-445c-a51d-5404874df3d9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 722.099039] env[65918]: DEBUG oslo_concurrency.lockutils [req-f63e0c8a-0aa6-49c1-b303-6e5cb4cd28ff req-f0297563-2600-41dd-92de-564a3006e7ad service nova] Lock "3b3f8c10-5ba5-445c-a51d-5404874df3d9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 722.099039] env[65918]: DEBUG oslo_concurrency.lockutils [req-f63e0c8a-0aa6-49c1-b303-6e5cb4cd28ff req-f0297563-2600-41dd-92de-564a3006e7ad service nova] Lock "3b3f8c10-5ba5-445c-a51d-5404874df3d9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 722.099197] env[65918]: DEBUG nova.compute.manager [req-f63e0c8a-0aa6-49c1-b303-6e5cb4cd28ff req-f0297563-2600-41dd-92de-564a3006e7ad service nova] [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] No waiting events found dispatching network-vif-plugged-e22ef964-3212-4cc4-b343-cd2af9c8a17d {{(pid=65918) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 722.099197] env[65918]: WARNING nova.compute.manager [req-f63e0c8a-0aa6-49c1-b303-6e5cb4cd28ff req-f0297563-2600-41dd-92de-564a3006e7ad service nova] [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Received unexpected event network-vif-plugged-e22ef964-3212-4cc4-b343-cd2af9c8a17d for instance with vm_state building and task_state spawning. [ 722.299782] env[65918]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848182, 'name': CreateVM_Task} progress is 25%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 722.801955] env[65918]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848182, 'name': CreateVM_Task, 'duration_secs': 0.954029} completed successfully. 
{{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 722.803068] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Created VM on the ESX host {{(pid=65918) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 722.803143] env[65918]: DEBUG oslo_concurrency.lockutils [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 722.803837] env[65918]: DEBUG oslo_concurrency.lockutils [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 722.803906] env[65918]: DEBUG oslo_concurrency.lockutils [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 722.804170] env[65918]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-169415f1-d425-4678-8493-5567783d7491 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 722.810236] env[65918]: DEBUG oslo_vmware.api [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] Waiting for the task: (returnval){ [ 722.810236] env[65918]: value = "session[523f5ec2-7667-1e2c-bfa3-3051980fd847]529ccdd5-1de6-44e7-a5c1-a23bea476517" [ 722.810236] env[65918]: _type = "Task" [ 722.810236] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 722.820178] env[65918]: DEBUG oslo_vmware.api [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] Task: {'id': session[523f5ec2-7667-1e2c-bfa3-3051980fd847]529ccdd5-1de6-44e7-a5c1-a23bea476517, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 722.829729] env[65918]: DEBUG oslo_concurrency.lockutils [None req-f0b21c67-50f6-4525-b223-8e361de16751 tempest-SecurityGroupsTestJSON-607972084 tempest-SecurityGroupsTestJSON-607972084-project-member] Acquiring lock "1e1d75b1-b4c3-4c72-a8e9-4e2f1b5103d4" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 722.829983] env[65918]: DEBUG oslo_concurrency.lockutils [None req-f0b21c67-50f6-4525-b223-8e361de16751 tempest-SecurityGroupsTestJSON-607972084 tempest-SecurityGroupsTestJSON-607972084-project-member] Lock "1e1d75b1-b4c3-4c72-a8e9-4e2f1b5103d4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 723.325380] env[65918]: DEBUG oslo_concurrency.lockutils [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 723.325653] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Processing image e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 723.326045] env[65918]: DEBUG oslo_concurrency.lockutils [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 724.306386] env[65918]: DEBUG nova.compute.manager [req-9bba23c1-e640-4d9c-8079-499590052a6e req-4217300e-6269-4e4d-94d9-d036711e7b6d service nova] [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Received event network-changed-e22ef964-3212-4cc4-b343-cd2af9c8a17d {{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 724.306386] env[65918]: DEBUG nova.compute.manager [req-9bba23c1-e640-4d9c-8079-499590052a6e req-4217300e-6269-4e4d-94d9-d036711e7b6d service nova] [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Refreshing instance network info cache due to event network-changed-e22ef964-3212-4cc4-b343-cd2af9c8a17d. 
{{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 724.306386] env[65918]: DEBUG oslo_concurrency.lockutils [req-9bba23c1-e640-4d9c-8079-499590052a6e req-4217300e-6269-4e4d-94d9-d036711e7b6d service nova] Acquiring lock "refresh_cache-3b3f8c10-5ba5-445c-a51d-5404874df3d9" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 724.306386] env[65918]: DEBUG oslo_concurrency.lockutils [req-9bba23c1-e640-4d9c-8079-499590052a6e req-4217300e-6269-4e4d-94d9-d036711e7b6d service nova] Acquired lock "refresh_cache-3b3f8c10-5ba5-445c-a51d-5404874df3d9" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 724.306386] env[65918]: DEBUG nova.network.neutron [req-9bba23c1-e640-4d9c-8079-499590052a6e req-4217300e-6269-4e4d-94d9-d036711e7b6d service nova] [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Refreshing network info cache for port e22ef964-3212-4cc4-b343-cd2af9c8a17d {{(pid=65918) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 724.693443] env[65918]: DEBUG nova.network.neutron [req-9bba23c1-e640-4d9c-8079-499590052a6e req-4217300e-6269-4e4d-94d9-d036711e7b6d service nova] [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Updated VIF entry in instance network info cache for port e22ef964-3212-4cc4-b343-cd2af9c8a17d. {{(pid=65918) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 724.693443] env[65918]: DEBUG nova.network.neutron [req-9bba23c1-e640-4d9c-8079-499590052a6e req-4217300e-6269-4e4d-94d9-d036711e7b6d service nova] [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Updating instance_info_cache with network_info: [{"id": "e22ef964-3212-4cc4-b343-cd2af9c8a17d", "address": "fa:16:3e:b3:99:b9", "network": {"id": "f83c5645-7cec-4eaf-8208-f38a93be2d65", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1516857676-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0e9bba0ff94f491a9c4e7f8e7b32c4d8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "352165bb-004f-4180-9627-3a275dbe18af", "external-id": "nsx-vlan-transportzone-926", "segmentation_id": 926, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape22ef964-32", "ovs_interfaceid": "e22ef964-3212-4cc4-b343-cd2af9c8a17d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 724.703619] env[65918]: DEBUG oslo_concurrency.lockutils [req-9bba23c1-e640-4d9c-8079-499590052a6e req-4217300e-6269-4e4d-94d9-d036711e7b6d service nova] Releasing lock "refresh_cache-3b3f8c10-5ba5-445c-a51d-5404874df3d9" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 726.133234] env[65918]: DEBUG oslo_concurrency.lockutils [None req-399b6463-c818-424e-806d-f644084c30ef tempest-FloatingIPsAssociationTestJSON-1535304117 tempest-FloatingIPsAssociationTestJSON-1535304117-project-member] Acquiring lock 
"b94157e3-2da8-4709-b1cf-b2bb14e0a6f3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 726.133542] env[65918]: DEBUG oslo_concurrency.lockutils [None req-399b6463-c818-424e-806d-f644084c30ef tempest-FloatingIPsAssociationTestJSON-1535304117 tempest-FloatingIPsAssociationTestJSON-1535304117-project-member] Lock "b94157e3-2da8-4709-b1cf-b2bb14e0a6f3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 727.336758] env[65918]: DEBUG oslo_concurrency.lockutils [None req-671abe19-9d2a-44a0-8b32-4e8a272e058d tempest-ImagesOneServerNegativeTestJSON-561303778 tempest-ImagesOneServerNegativeTestJSON-561303778-project-member] Acquiring lock "529e8acc-2775-4eac-8e99-7e901a08f1d4" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 727.337054] env[65918]: DEBUG oslo_concurrency.lockutils [None req-671abe19-9d2a-44a0-8b32-4e8a272e058d tempest-ImagesOneServerNegativeTestJSON-561303778 tempest-ImagesOneServerNegativeTestJSON-561303778-project-member] Lock "529e8acc-2775-4eac-8e99-7e901a08f1d4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 732.846770] env[65918]: DEBUG oslo_concurrency.lockutils [None req-13b00a8c-773b-4b30-9aa7-7392da7c463a tempest-ServerPasswordTestJSON-953704500 tempest-ServerPasswordTestJSON-953704500-project-member] Acquiring lock "f26e4561-b450-4582-a415-a90a4dda7837" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 732.847079] env[65918]: DEBUG oslo_concurrency.lockutils [None req-13b00a8c-773b-4b30-9aa7-7392da7c463a tempest-ServerPasswordTestJSON-953704500 tempest-ServerPasswordTestJSON-953704500-project-member] Lock "f26e4561-b450-4582-a415-a90a4dda7837" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 750.421788] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 750.424552] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 750.424552] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Starting heal instance info cache {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 750.424552] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Rebuilding the list of 
instances to heal {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 750.443238] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: 169c3642-1229-4c49-9e04-67e4e1764286] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 750.443455] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 750.443650] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 750.443838] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 750.443984] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 750.444130] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 750.444256] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 750.444377] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 750.444504] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 750.444621] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 750.444738] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Didn't find any instances for network info cache update. 
{{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 750.445185] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 750.445329] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=65918) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 750.445476] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager.update_available_resource {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 750.454669] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 750.454868] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 750.455051] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 750.455207] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=65918) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 750.456229] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a0c8817-5bc4-43f3-ab78-6c7370ad8b6c {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 750.467039] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9119c724-5371-469f-bc3b-55b8d8dfa73a {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 750.481355] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c07528dd-3158-4a8a-ae57-954dadde9a29 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 750.487992] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-059cb34c-44f2-4cde-a385-9d15dcd6860d {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 750.516987] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181052MB free_disk=168GB free_vcpus=48 pci_devices=None 
{{(pid=65918) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 750.517171] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 750.517403] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 750.586043] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 169c3642-1229-4c49-9e04-67e4e1764286 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 750.586043] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance cf0087c7-22d0-4317-a00a-73967ccafeaa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 750.586043] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 7bc8087e-17e1-4cbd-84be-bd6c07e104ce actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 750.586043] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 5c83d7da-f63b-40b7-a1aa-916ba9343439 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 750.586265] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance bd0158bd-e255-4680-b00e-81eb1ce88ad5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 750.586265] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 51163f89-c8b6-48a8-bbbe-de63c44d92a5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 750.586265] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 750.586391] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 934b5745-6c9a-4d21-92b8-7505a170e600 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 750.586547] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance bba6f3d9-1be3-4048-86d5-f435511b0fc0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 750.586699] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 3b3f8c10-5ba5-445c-a51d-5404874df3d9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 750.598553] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 0ccebca0-a1a4-48b2-9154-1c73350dab38 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 750.624143] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 750.636562] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance c04e5253-0275-4fb3-8eca-6a395c95930f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 750.646999] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 190285fc-ed83-417a-90db-b3c94feb4ce3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 750.656855] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 12059024-91af-400d-be6b-36fe9482b22b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 750.666028] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 78576ca1-7755-4532-82ee-de46c9d3a1fc has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 750.675851] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance bac2eb3c-464c-4859-afa6-d7a16ec452a0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 750.686357] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance a95f80f7-67e0-4f35-a3fb-5ecd02c783ba has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 750.699634] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 89dd139c-4533-4d48-aefa-750086205ad1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 750.710315] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 46e1dfe1-df73-430c-85ef-f5753974eed0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 750.722922] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance c07fe815-199c-41e8-b102-d2ffae0bb12c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 750.737755] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 6af69c4f-4822-4170-94a0-cdc587c825f7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 750.749309] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance efe8bbf3-f76d-4509-85d1-ffff559358b5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 750.764992] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 1e1d75b1-b4c3-4c72-a8e9-4e2f1b5103d4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 750.774505] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance b94157e3-2da8-4709-b1cf-b2bb14e0a6f3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 750.784453] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 529e8acc-2775-4eac-8e99-7e901a08f1d4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 750.795255] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance f26e4561-b450-4582-a415-a90a4dda7837 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 750.797015] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=65918) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 750.797015] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=65918) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 751.086715] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c537afb-ca39-4ccd-abe3-6b96cd1424ff {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 751.094641] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71190883-9db2-4a42-944c-03e4d0b1b0f8 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 751.126056] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-803b4dd3-45df-4549-bb90-fe91f86a888d {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 751.133275] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f1649e8-de65-4920-ab3b-8e69cf074672 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 751.147020] env[65918]: DEBUG nova.compute.provider_tree [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Inventory has not changed in ProviderTree for provider: 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 {{(pid=65918) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 751.157419] env[65918]: DEBUG nova.scheduler.client.report [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Inventory has not changed for provider 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65918) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 751.172163] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=65918) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 751.172163] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.655s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 752.150509] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running 
periodic task ComputeManager._poll_rescued_instances {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 752.150868] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 752.426209] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 753.423608] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 754.424449] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 764.774813] env[65918]: WARNING oslo_vmware.rw_handles [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 764.774813] env[65918]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 764.774813] env[65918]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 764.774813] env[65918]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 764.774813] env[65918]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 764.774813] env[65918]: ERROR oslo_vmware.rw_handles response.begin() [ 764.774813] env[65918]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 764.774813] env[65918]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 764.774813] env[65918]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 764.774813] env[65918]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 764.774813] env[65918]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 764.774813] env[65918]: ERROR oslo_vmware.rw_handles [ 764.775421] env[65918]: DEBUG nova.virt.vmwareapi.images [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] [instance: 169c3642-1229-4c49-9e04-67e4e1764286] Downloaded image file data e017c336-3a02-4b58-874a-44a1d1e154fd to vmware_temp/d4ebbe06-09fd-4dfb-be30-4cba106b1056/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk on the data store datastore1 {{(pid=65918) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 764.777138] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 
tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] [instance: 169c3642-1229-4c49-9e04-67e4e1764286] Caching image {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 764.777420] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Copying Virtual Disk [datastore1] vmware_temp/d4ebbe06-09fd-4dfb-be30-4cba106b1056/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk to [datastore1] vmware_temp/d4ebbe06-09fd-4dfb-be30-4cba106b1056/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk {{(pid=65918) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 764.777719] env[65918]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-c5a366f9-d81c-422e-a913-fe7ba50f4ea7 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 764.785958] env[65918]: DEBUG oslo_vmware.api [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Waiting for the task: (returnval){ [ 764.785958] env[65918]: value = "task-2848183" [ 764.785958] env[65918]: _type = "Task" [ 764.785958] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 764.794304] env[65918]: DEBUG oslo_vmware.api [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Task: {'id': task-2848183, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 765.295933] env[65918]: DEBUG oslo_vmware.exceptions [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Fault InvalidArgument not matched. 
{{(pid=65918) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 765.296211] env[65918]: DEBUG oslo_concurrency.lockutils [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 765.296758] env[65918]: ERROR nova.compute.manager [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] [instance: 169c3642-1229-4c49-9e04-67e4e1764286] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 765.296758] env[65918]: Faults: ['InvalidArgument'] [ 765.296758] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] Traceback (most recent call last): [ 765.296758] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 765.296758] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] yield resources [ 765.296758] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 765.296758] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] self.driver.spawn(context, instance, image_meta, [ 765.296758] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 765.296758] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] self._vmops.spawn(context, instance, image_meta, injected_files, [ 765.296758] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 765.296758] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] self._fetch_image_if_missing(context, vi) [ 765.296758] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 765.297105] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] image_cache(vi, tmp_image_ds_loc) [ 765.297105] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 765.297105] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] vm_util.copy_virtual_disk( [ 765.297105] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 765.297105] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] session._wait_for_task(vmdk_copy_task) [ 765.297105] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 765.297105] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] return self.wait_for_task(task_ref) [ 765.297105] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 765.297105] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] return evt.wait() [ 765.297105] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 765.297105] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] result = hub.switch() [ 765.297105] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 765.297105] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] return self.greenlet.switch() [ 765.297484] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 765.297484] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] self.f(*self.args, **self.kw) [ 765.297484] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 765.297484] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] raise exceptions.translate_fault(task_info.error) [ 765.297484] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 765.297484] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] Faults: ['InvalidArgument'] [ 765.297484] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] [ 765.297484] env[65918]: INFO nova.compute.manager [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] [instance: 169c3642-1229-4c49-9e04-67e4e1764286] Terminating instance [ 765.298709] env[65918]: DEBUG oslo_concurrency.lockutils [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 765.298918] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 765.299165] env[65918]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3107611e-b230-4c1b-a017-a3b438ef2c44 {{(pid=65918) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 765.301554] env[65918]: DEBUG nova.compute.manager [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] [instance: 169c3642-1229-4c49-9e04-67e4e1764286] Start destroying the instance on the hypervisor. {{(pid=65918) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 765.301801] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] [instance: 169c3642-1229-4c49-9e04-67e4e1764286] Destroying instance {{(pid=65918) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 765.302525] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-08d64cb2-8e9a-4e76-a6f7-d35a38149670 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 765.308992] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] [instance: 169c3642-1229-4c49-9e04-67e4e1764286] Unregistering the VM {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 765.309240] env[65918]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-03bcb254-b8e5-4d45-aa5c-5df2831e2d25 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 765.311499] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 765.311701] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=65918) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 765.312674] env[65918]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-19808e59-a961-4641-affc-2ba85269c2ab {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 765.318541] env[65918]: DEBUG oslo_vmware.api [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Waiting for the task: (returnval){ [ 765.318541] env[65918]: value = "session[523f5ec2-7667-1e2c-bfa3-3051980fd847]52a88380-fce4-34cb-8853-816284230128" [ 765.318541] env[65918]: _type = "Task" [ 765.318541] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 765.329979] env[65918]: DEBUG oslo_vmware.api [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Task: {'id': session[523f5ec2-7667-1e2c-bfa3-3051980fd847]52a88380-fce4-34cb-8853-816284230128, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 765.376060] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] [instance: 169c3642-1229-4c49-9e04-67e4e1764286] Unregistered the VM {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 765.376060] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] [instance: 169c3642-1229-4c49-9e04-67e4e1764286] Deleting contents of the VM from datastore datastore1 {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 765.376235] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Deleting the datastore file [datastore1] 169c3642-1229-4c49-9e04-67e4e1764286 {{(pid=65918) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 765.376485] env[65918]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-7a3ae303-f34c-4783-b26d-98e4446c7c26 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 765.382355] env[65918]: DEBUG oslo_vmware.api [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Waiting for the task: (returnval){ [ 765.382355] env[65918]: value = "task-2848185" [ 765.382355] env[65918]: _type = "Task" [ 765.382355] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 765.389861] env[65918]: DEBUG oslo_vmware.api [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Task: {'id': task-2848185, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 765.829450] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] Preparing fetch location {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 765.829450] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Creating directory with path [datastore1] vmware_temp/d2fc0795-1c9e-4207-a826-fdb8f00bc21b/e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 765.829450] env[65918]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c4eb8dec-f9b2-4fd9-97f3-09fa5877e4f7 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 765.841145] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Created directory with path [datastore1] vmware_temp/d2fc0795-1c9e-4207-a826-fdb8f00bc21b/e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 765.841352] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] Fetch image to [datastore1] vmware_temp/d2fc0795-1c9e-4207-a826-fdb8f00bc21b/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 765.841522] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] Downloading image file data e017c336-3a02-4b58-874a-44a1d1e154fd to [datastore1] vmware_temp/d2fc0795-1c9e-4207-a826-fdb8f00bc21b/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk on the data store datastore1 {{(pid=65918) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 765.842336] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f67e200-7679-4b0c-8a31-324b17cd7a2c {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 765.850158] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e3acbba-3895-40d1-954e-6df065865e79 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 765.859098] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-031feba7-2116-4578-b487-b672f042f19d {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 765.894049] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e240faec-4b12-4c84-b008-0640d292ab91 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} 
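
The entries above show how population of the VMware image cache is serialized: the request holding the lock named "[datastore1] devstack-image-cache_base/<image-id>/<image-id>.vmdk" downloads the image data into a vmware_temp/ staging directory while concurrent spawns for the same image wait. The following is a minimal sketch of that lock-around-fetch pattern using oslo_concurrency.lockutils, not Nova's actual implementation; the helper functions and values are hypothetical placeholders.

    # Minimal sketch, assuming only the lock-name convention visible in the log.
    # fetch_to_temp() and move_into_cache() are hypothetical stand-ins, not Nova code.
    from oslo_concurrency import lockutils

    def fetch_to_temp(image_id):
        # Placeholder for downloading the image into a vmware_temp/ staging path.
        print('downloading %s to a vmware_temp staging directory' % image_id)

    def move_into_cache(image_id):
        # Placeholder for publishing the disk under devstack-image-cache_base.
        print('caching %s under devstack-image-cache_base' % image_id)

    def cache_image(image_id, datastore='datastore1'):
        lock_name = ('[%s] devstack-image-cache_base/%s/%s.vmdk'
                     % (datastore, image_id, image_id))
        # Same lockutils machinery that produces the Acquiring/acquired/released
        # DEBUG lines throughout this log: one request populates the cache entry,
        # the others block here instead of re-downloading the same image.
        with lockutils.lock(lock_name):
            fetch_to_temp(image_id)
            move_into_cache(image_id)

    if __name__ == '__main__':
        cache_image('e017c336-3a02-4b58-874a-44a1d1e154fd')
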
[ 765.900295] env[65918]: DEBUG oslo_vmware.api [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Task: {'id': task-2848185, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.077432} completed successfully. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 765.901783] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Deleted the datastore file {{(pid=65918) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 765.901971] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] [instance: 169c3642-1229-4c49-9e04-67e4e1764286] Deleted contents of the VM from datastore datastore1 {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 765.902161] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] [instance: 169c3642-1229-4c49-9e04-67e4e1764286] Instance destroyed {{(pid=65918) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 765.902330] env[65918]: INFO nova.compute.manager [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] [instance: 169c3642-1229-4c49-9e04-67e4e1764286] Took 0.60 seconds to destroy the instance on the hypervisor. 
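
The cleanup above follows the usual oslo.vmware pattern: invoke a vCenter task (here FileManager.DeleteDatastoreFile_Task), then poll it until vCenter reports completion, which is what the "Waiting for the task ... progress is 0% ... completed successfully" entries record. Below is a hedged sketch of that invoke-then-wait flow; the vCenter host, credentials, retry and poll values, datastore path and datacenter reference are assumed placeholders, and this approximates rather than reproduces Nova's ds_util.file_delete().

    # Sketch of the oslo.vmware invoke-then-poll pattern, under the assumptions
    # stated above; all connection parameters and object references are placeholders.
    from oslo_vmware import api

    def delete_datastore_file(vc_host, user, password, ds_path, datacenter_ref):
        # api_retry_count=10 and task_poll_interval=0.5s are arbitrary example values.
        session = api.VMwareAPISession(vc_host, user, password, 10, 0.5)
        try:
            # Starts FileManager.DeleteDatastoreFile_Task on the vCenter side,
            # mirroring the "Invoking FileManager.DeleteDatastoreFile_Task" entry.
            task = session.invoke_api(session.vim, 'DeleteDatastoreFile_Task',
                                      session.vim.service_content.fileManager,
                                      name=ds_path, datacenter=datacenter_ref)
            # wait_for_task() polls the task (the "progress is 0%" lines) and
            # raises a translated fault such as VimFaultException if it errors.
            session.wait_for_task(task)
        finally:
            session.logout()
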
[ 765.904047] env[65918]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-0bcc6cbb-57d3-4328-b72b-e049ca9c4741 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 765.906380] env[65918]: DEBUG nova.compute.claims [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] [instance: 169c3642-1229-4c49-9e04-67e4e1764286] Aborting claim: {{(pid=65918) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 765.906535] env[65918]: DEBUG oslo_concurrency.lockutils [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 765.906743] env[65918]: DEBUG oslo_concurrency.lockutils [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 765.932224] env[65918]: DEBUG nova.virt.vmwareapi.images [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] Downloading image file data e017c336-3a02-4b58-874a-44a1d1e154fd to the data store datastore1 {{(pid=65918) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 765.982094] env[65918]: DEBUG oslo_vmware.rw_handles [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d2fc0795-1c9e-4207-a826-fdb8f00bc21b/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=65918) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 766.040175] env[65918]: DEBUG oslo_vmware.rw_handles [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Completed reading data from the image iterator. {{(pid=65918) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 766.040363] env[65918]: DEBUG oslo_vmware.rw_handles [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d2fc0795-1c9e-4207-a826-fdb8f00bc21b/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=65918) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 766.292886] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a06a8122-ebbe-487a-99e3-eb8b79055d11 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 766.300579] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-57a45339-00ef-4fa2-be02-5ae2911bbed1 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 766.329503] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f24478d3-42d0-4226-be3f-9cb743f642e8 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 766.336540] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b89a22c4-715c-4d57-bb9e-4827fb61f2cb {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 766.349432] env[65918]: DEBUG nova.compute.provider_tree [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Inventory has not changed in ProviderTree for provider: 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 {{(pid=65918) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 766.358244] env[65918]: DEBUG nova.scheduler.client.report [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Inventory has not changed for provider 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65918) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 766.371805] env[65918]: DEBUG oslo_concurrency.lockutils [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.465s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 766.372399] env[65918]: ERROR nova.compute.manager [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] [instance: 169c3642-1229-4c49-9e04-67e4e1764286] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 766.372399] env[65918]: Faults: ['InvalidArgument'] [ 766.372399] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] Traceback (most recent call last): [ 766.372399] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in 
_build_and_run_instance [ 766.372399] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] self.driver.spawn(context, instance, image_meta, [ 766.372399] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 766.372399] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] self._vmops.spawn(context, instance, image_meta, injected_files, [ 766.372399] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 766.372399] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] self._fetch_image_if_missing(context, vi) [ 766.372399] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 766.372399] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] image_cache(vi, tmp_image_ds_loc) [ 766.372399] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 766.372686] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] vm_util.copy_virtual_disk( [ 766.372686] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 766.372686] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] session._wait_for_task(vmdk_copy_task) [ 766.372686] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 766.372686] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] return self.wait_for_task(task_ref) [ 766.372686] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 766.372686] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] return evt.wait() [ 766.372686] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 766.372686] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] result = hub.switch() [ 766.372686] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 766.372686] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] return self.greenlet.switch() [ 766.372686] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 766.372686] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] self.f(*self.args, **self.kw) [ 766.372980] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] File 
"/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 766.372980] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] raise exceptions.translate_fault(task_info.error) [ 766.372980] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 766.372980] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] Faults: ['InvalidArgument'] [ 766.372980] env[65918]: ERROR nova.compute.manager [instance: 169c3642-1229-4c49-9e04-67e4e1764286] [ 766.373403] env[65918]: DEBUG nova.compute.utils [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] [instance: 169c3642-1229-4c49-9e04-67e4e1764286] VimFaultException {{(pid=65918) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 766.374699] env[65918]: DEBUG nova.compute.manager [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] [instance: 169c3642-1229-4c49-9e04-67e4e1764286] Build of instance 169c3642-1229-4c49-9e04-67e4e1764286 was re-scheduled: A specified parameter was not correct: fileType [ 766.374699] env[65918]: Faults: ['InvalidArgument'] {{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 766.376026] env[65918]: DEBUG nova.compute.manager [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] [instance: 169c3642-1229-4c49-9e04-67e4e1764286] Unplugging VIFs for instance {{(pid=65918) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 766.376026] env[65918]: DEBUG nova.compute.manager [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=65918) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 766.376026] env[65918]: DEBUG nova.compute.manager [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] [instance: 169c3642-1229-4c49-9e04-67e4e1764286] Deallocating network for instance {{(pid=65918) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 766.376026] env[65918]: DEBUG nova.network.neutron [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] [instance: 169c3642-1229-4c49-9e04-67e4e1764286] deallocate_for_instance() {{(pid=65918) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 766.708089] env[65918]: DEBUG nova.network.neutron [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] [instance: 169c3642-1229-4c49-9e04-67e4e1764286] Updating instance_info_cache with network_info: [] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 766.718396] env[65918]: INFO nova.compute.manager [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] [instance: 169c3642-1229-4c49-9e04-67e4e1764286] Took 0.34 seconds to deallocate network for instance. [ 766.810671] env[65918]: INFO nova.scheduler.client.report [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Deleted allocations for instance 169c3642-1229-4c49-9e04-67e4e1764286 [ 766.826813] env[65918]: DEBUG oslo_concurrency.lockutils [None req-ab642f67-55a8-4729-8bbb-a844f0db0faa tempest-FloatingIPsAssociationNegativeTestJSON-953818815 tempest-FloatingIPsAssociationNegativeTestJSON-953818815-project-member] Lock "169c3642-1229-4c49-9e04-67e4e1764286" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 147.315s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 766.858155] env[65918]: DEBUG nova.compute.manager [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] Starting instance... 
{{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 766.905760] env[65918]: DEBUG oslo_concurrency.lockutils [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 766.906046] env[65918]: DEBUG oslo_concurrency.lockutils [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 766.907688] env[65918]: INFO nova.compute.claims [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 767.229525] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e55ba103-4b1a-4f38-8f86-d76157db31bb {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 767.237254] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f556d8b0-afa2-4b8f-9a7d-7e7510d4ce5c {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 767.266578] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2ed1c5c-fdbc-4db0-ac50-6f54991e8cb9 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 767.274095] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c30553c-217b-4ad8-8a0a-f42ebc0c6bff {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 767.289050] env[65918]: DEBUG nova.compute.provider_tree [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] Inventory has not changed in ProviderTree for provider: 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 {{(pid=65918) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 767.296196] env[65918]: DEBUG nova.scheduler.client.report [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] Inventory has not changed for provider 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65918) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 767.310029] env[65918]: DEBUG oslo_concurrency.lockutils [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c 
tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.403s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 767.310029] env[65918]: DEBUG nova.compute.manager [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] Start building networks asynchronously for instance. {{(pid=65918) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 767.344817] env[65918]: DEBUG nova.compute.utils [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] Using /dev/sd instead of None {{(pid=65918) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 767.345716] env[65918]: DEBUG nova.compute.manager [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] Allocating IP information in the background. {{(pid=65918) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 767.345995] env[65918]: DEBUG nova.network.neutron [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] allocate_for_instance() {{(pid=65918) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 767.355998] env[65918]: DEBUG nova.compute.manager [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] Start building block device mappings for instance. {{(pid=65918) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 767.416231] env[65918]: DEBUG nova.policy [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cd9b089e480b4e888dda38c515bda367', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f7c249e47d0d4c3abcd3a1b1950d675f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=65918) authorize /opt/stack/nova/nova/policy.py:203}} [ 767.427802] env[65918]: DEBUG nova.compute.manager [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] Start spawning the instance on the hypervisor. 
{{(pid=65918) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 767.447950] env[65918]: DEBUG nova.virt.hardware [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:52:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:52:33Z,direct_url=,disk_format='vmdk',id=e017c336-3a02-4b58-874a-44a1d1e154fd,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='a06639ef48fd43768e273db76d6c8f54',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:52:33Z,virtual_size=,visibility=), allow threads: False {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 767.448219] env[65918]: DEBUG nova.virt.hardware [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] Flavor limits 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 767.448376] env[65918]: DEBUG nova.virt.hardware [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] Image limits 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 767.448558] env[65918]: DEBUG nova.virt.hardware [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] Flavor pref 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 767.448703] env[65918]: DEBUG nova.virt.hardware [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] Image pref 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 767.448850] env[65918]: DEBUG nova.virt.hardware [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 767.449088] env[65918]: DEBUG nova.virt.hardware [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 767.449287] env[65918]: DEBUG nova.virt.hardware [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=65918) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 767.449457] 
env[65918]: DEBUG nova.virt.hardware [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] Got 1 possible topologies {{(pid=65918) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 767.449618] env[65918]: DEBUG nova.virt.hardware [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 767.449785] env[65918]: DEBUG nova.virt.hardware [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 767.450903] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-98916d4c-fd10-465c-8680-136b997b8207 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 767.458647] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e590601-2f06-412f-8bfc-e5970dfc499b {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 767.812207] env[65918]: DEBUG nova.network.neutron [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] Successfully created port: fa0df58d-7dc1-4739-857b-a6dacaf24577 {{(pid=65918) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 768.532128] env[65918]: DEBUG nova.network.neutron [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] Successfully updated port: fa0df58d-7dc1-4739-857b-a6dacaf24577 {{(pid=65918) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 768.545826] env[65918]: DEBUG oslo_concurrency.lockutils [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] Acquiring lock "refresh_cache-0ccebca0-a1a4-48b2-9154-1c73350dab38" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 768.545970] env[65918]: DEBUG oslo_concurrency.lockutils [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] Acquired lock "refresh_cache-0ccebca0-a1a4-48b2-9154-1c73350dab38" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 768.546315] env[65918]: DEBUG nova.network.neutron [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] Building network info cache for instance {{(pid=65918) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 768.593874] env[65918]: DEBUG nova.network.neutron [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c 
tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] Instance cache missing network info. {{(pid=65918) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 768.834549] env[65918]: DEBUG nova.network.neutron [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] Updating instance_info_cache with network_info: [{"id": "fa0df58d-7dc1-4739-857b-a6dacaf24577", "address": "fa:16:3e:df:64:9d", "network": {"id": "e15c0197-eb68-463b-bd73-449b5f84c872", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1578547113-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f7c249e47d0d4c3abcd3a1b1950d675f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d1e1e320-ec56-4fcc-b6e9-30aa210d3b36", "external-id": "nsx-vlan-transportzone-447", "segmentation_id": 447, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapfa0df58d-7d", "ovs_interfaceid": "fa0df58d-7dc1-4739-857b-a6dacaf24577", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 768.851147] env[65918]: DEBUG oslo_concurrency.lockutils [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] Releasing lock "refresh_cache-0ccebca0-a1a4-48b2-9154-1c73350dab38" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 768.851147] env[65918]: DEBUG nova.compute.manager [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] Instance network_info: |[{"id": "fa0df58d-7dc1-4739-857b-a6dacaf24577", "address": "fa:16:3e:df:64:9d", "network": {"id": "e15c0197-eb68-463b-bd73-449b5f84c872", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1578547113-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f7c249e47d0d4c3abcd3a1b1950d675f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d1e1e320-ec56-4fcc-b6e9-30aa210d3b36", "external-id": "nsx-vlan-transportzone-447", "segmentation_id": 447, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapfa0df58d-7d", "ovs_interfaceid": "fa0df58d-7dc1-4739-857b-a6dacaf24577", "qbh_params": null, 
"qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=65918) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 768.851279] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:df:64:9d', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'd1e1e320-ec56-4fcc-b6e9-30aa210d3b36', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'fa0df58d-7dc1-4739-857b-a6dacaf24577', 'vif_model': 'vmxnet3'}] {{(pid=65918) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 768.860308] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] Creating folder: Project (f7c249e47d0d4c3abcd3a1b1950d675f). Parent ref: group-v572679. {{(pid=65918) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 768.863741] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d4e434a7-f362-464a-8f75-31eadd2da640 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 768.871545] env[65918]: INFO nova.virt.vmwareapi.vm_util [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] Created folder: Project (f7c249e47d0d4c3abcd3a1b1950d675f) in parent group-v572679. [ 768.872253] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] Creating folder: Instances. Parent ref: group-v572718. {{(pid=65918) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 768.873556] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-316c50b5-e099-4af8-825c-e5444bcd0895 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 768.882024] env[65918]: INFO nova.virt.vmwareapi.vm_util [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] Created folder: Instances in parent group-v572718. [ 768.882024] env[65918]: DEBUG oslo.service.loopingcall [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=65918) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 768.882024] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] Creating VM on the ESX host {{(pid=65918) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 768.882304] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-5f144a60-ddf7-4a60-a0d8-eada93990901 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 768.903705] env[65918]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 768.903705] env[65918]: value = "task-2848188" [ 768.903705] env[65918]: _type = "Task" [ 768.903705] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 768.910957] env[65918]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848188, 'name': CreateVM_Task} progress is 0%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 768.917822] env[65918]: DEBUG nova.compute.manager [req-98010bf6-e8fa-4ce2-991d-cc182d05058e req-6e5d80b2-2efa-4a46-80f3-c651464524b5 service nova] [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] Received event network-vif-plugged-fa0df58d-7dc1-4739-857b-a6dacaf24577 {{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 768.918050] env[65918]: DEBUG oslo_concurrency.lockutils [req-98010bf6-e8fa-4ce2-991d-cc182d05058e req-6e5d80b2-2efa-4a46-80f3-c651464524b5 service nova] Acquiring lock "0ccebca0-a1a4-48b2-9154-1c73350dab38-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 768.918260] env[65918]: DEBUG oslo_concurrency.lockutils [req-98010bf6-e8fa-4ce2-991d-cc182d05058e req-6e5d80b2-2efa-4a46-80f3-c651464524b5 service nova] Lock "0ccebca0-a1a4-48b2-9154-1c73350dab38-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 768.918428] env[65918]: DEBUG oslo_concurrency.lockutils [req-98010bf6-e8fa-4ce2-991d-cc182d05058e req-6e5d80b2-2efa-4a46-80f3-c651464524b5 service nova] Lock "0ccebca0-a1a4-48b2-9154-1c73350dab38-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 768.918592] env[65918]: DEBUG nova.compute.manager [req-98010bf6-e8fa-4ce2-991d-cc182d05058e req-6e5d80b2-2efa-4a46-80f3-c651464524b5 service nova] [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] No waiting events found dispatching network-vif-plugged-fa0df58d-7dc1-4739-857b-a6dacaf24577 {{(pid=65918) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 768.918752] env[65918]: WARNING nova.compute.manager [req-98010bf6-e8fa-4ce2-991d-cc182d05058e req-6e5d80b2-2efa-4a46-80f3-c651464524b5 service nova] [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] Received unexpected event network-vif-plugged-fa0df58d-7dc1-4739-857b-a6dacaf24577 for instance with vm_state building and task_state spawning. 
[ 768.918910] env[65918]: DEBUG nova.compute.manager [req-98010bf6-e8fa-4ce2-991d-cc182d05058e req-6e5d80b2-2efa-4a46-80f3-c651464524b5 service nova] [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] Received event network-changed-fa0df58d-7dc1-4739-857b-a6dacaf24577 {{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 768.919274] env[65918]: DEBUG nova.compute.manager [req-98010bf6-e8fa-4ce2-991d-cc182d05058e req-6e5d80b2-2efa-4a46-80f3-c651464524b5 service nova] [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] Refreshing instance network info cache due to event network-changed-fa0df58d-7dc1-4739-857b-a6dacaf24577. {{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 768.919491] env[65918]: DEBUG oslo_concurrency.lockutils [req-98010bf6-e8fa-4ce2-991d-cc182d05058e req-6e5d80b2-2efa-4a46-80f3-c651464524b5 service nova] Acquiring lock "refresh_cache-0ccebca0-a1a4-48b2-9154-1c73350dab38" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 768.919628] env[65918]: DEBUG oslo_concurrency.lockutils [req-98010bf6-e8fa-4ce2-991d-cc182d05058e req-6e5d80b2-2efa-4a46-80f3-c651464524b5 service nova] Acquired lock "refresh_cache-0ccebca0-a1a4-48b2-9154-1c73350dab38" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 768.919831] env[65918]: DEBUG nova.network.neutron [req-98010bf6-e8fa-4ce2-991d-cc182d05058e req-6e5d80b2-2efa-4a46-80f3-c651464524b5 service nova] [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] Refreshing network info cache for port fa0df58d-7dc1-4739-857b-a6dacaf24577 {{(pid=65918) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 769.414565] env[65918]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848188, 'name': CreateVM_Task, 'duration_secs': 0.318811} completed successfully. 
{{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 769.414997] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] Created VM on the ESX host {{(pid=65918) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 769.416171] env[65918]: DEBUG oslo_concurrency.lockutils [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 769.416293] env[65918]: DEBUG oslo_concurrency.lockutils [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 769.416604] env[65918]: DEBUG oslo_concurrency.lockutils [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 769.416852] env[65918]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b4f0d45c-c526-4d97-b494-c8dd63a69ff1 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 769.421647] env[65918]: DEBUG oslo_vmware.api [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] Waiting for the task: (returnval){ [ 769.421647] env[65918]: value = "session[523f5ec2-7667-1e2c-bfa3-3051980fd847]520f9468-1d4e-17a0-aa83-e14f6a1f6a25" [ 769.421647] env[65918]: _type = "Task" [ 769.421647] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 769.431264] env[65918]: DEBUG oslo_vmware.api [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] Task: {'id': session[523f5ec2-7667-1e2c-bfa3-3051980fd847]520f9468-1d4e-17a0-aa83-e14f6a1f6a25, 'name': SearchDatastore_Task} progress is 0%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 769.434555] env[65918]: DEBUG nova.network.neutron [req-98010bf6-e8fa-4ce2-991d-cc182d05058e req-6e5d80b2-2efa-4a46-80f3-c651464524b5 service nova] [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] Updated VIF entry in instance network info cache for port fa0df58d-7dc1-4739-857b-a6dacaf24577. 
{{(pid=65918) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 769.435930] env[65918]: DEBUG nova.network.neutron [req-98010bf6-e8fa-4ce2-991d-cc182d05058e req-6e5d80b2-2efa-4a46-80f3-c651464524b5 service nova] [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] Updating instance_info_cache with network_info: [{"id": "fa0df58d-7dc1-4739-857b-a6dacaf24577", "address": "fa:16:3e:df:64:9d", "network": {"id": "e15c0197-eb68-463b-bd73-449b5f84c872", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1578547113-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f7c249e47d0d4c3abcd3a1b1950d675f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d1e1e320-ec56-4fcc-b6e9-30aa210d3b36", "external-id": "nsx-vlan-transportzone-447", "segmentation_id": 447, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapfa0df58d-7d", "ovs_interfaceid": "fa0df58d-7dc1-4739-857b-a6dacaf24577", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 769.448485] env[65918]: DEBUG oslo_concurrency.lockutils [req-98010bf6-e8fa-4ce2-991d-cc182d05058e req-6e5d80b2-2efa-4a46-80f3-c651464524b5 service nova] Releasing lock "refresh_cache-0ccebca0-a1a4-48b2-9154-1c73350dab38" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 769.933080] env[65918]: DEBUG oslo_concurrency.lockutils [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 769.933717] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] Processing image e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 769.934079] env[65918]: DEBUG oslo_concurrency.lockutils [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 810.421777] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 810.423338] env[65918]: DEBUG oslo_service.periodic_task [None 
req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager.update_available_resource {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 810.432661] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 810.432935] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 810.433124] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 810.433281] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=65918) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 810.434453] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1082ae13-6cb1-42b5-97f8-83890f11b7ce {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 810.443506] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5579ef33-bde6-49c5-9b7a-a44097244a1a {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 810.457205] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-009804bb-9b37-4c5e-a325-b5ba02b81acc {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 810.463278] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-55db4ad3-0ec6-431d-9560-3c34bacb39aa {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 810.493053] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180998MB free_disk=168GB free_vcpus=48 pci_devices=None {{(pid=65918) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 810.493053] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 810.493235] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s 
{{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 810.560534] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance cf0087c7-22d0-4317-a00a-73967ccafeaa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 810.560757] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 7bc8087e-17e1-4cbd-84be-bd6c07e104ce actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 810.560909] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 5c83d7da-f63b-40b7-a1aa-916ba9343439 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 810.561067] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance bd0158bd-e255-4680-b00e-81eb1ce88ad5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 810.561207] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 51163f89-c8b6-48a8-bbbe-de63c44d92a5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 810.561332] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 810.561451] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 934b5745-6c9a-4d21-92b8-7505a170e600 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 810.561568] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance bba6f3d9-1be3-4048-86d5-f435511b0fc0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 810.561684] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 3b3f8c10-5ba5-445c-a51d-5404874df3d9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 810.561800] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 0ccebca0-a1a4-48b2-9154-1c73350dab38 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 810.573666] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 810.585224] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance c04e5253-0275-4fb3-8eca-6a395c95930f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 810.595622] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 190285fc-ed83-417a-90db-b3c94feb4ce3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 810.607843] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 12059024-91af-400d-be6b-36fe9482b22b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 810.616803] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 78576ca1-7755-4532-82ee-de46c9d3a1fc has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 810.630168] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance bac2eb3c-464c-4859-afa6-d7a16ec452a0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 810.639600] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance a95f80f7-67e0-4f35-a3fb-5ecd02c783ba has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 810.649297] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 89dd139c-4533-4d48-aefa-750086205ad1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 810.661132] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 46e1dfe1-df73-430c-85ef-f5753974eed0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 810.671396] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance c07fe815-199c-41e8-b102-d2ffae0bb12c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 810.681129] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 6af69c4f-4822-4170-94a0-cdc587c825f7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 810.690528] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance efe8bbf3-f76d-4509-85d1-ffff559358b5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 810.700988] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 1e1d75b1-b4c3-4c72-a8e9-4e2f1b5103d4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 810.710418] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance b94157e3-2da8-4709-b1cf-b2bb14e0a6f3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 810.722934] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 529e8acc-2775-4eac-8e99-7e901a08f1d4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 810.732877] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance f26e4561-b450-4582-a415-a90a4dda7837 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 810.733140] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=65918) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 810.733292] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=65918) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 811.019138] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f58e118-4b31-4f2a-90e4-5f6d20015aca {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 811.026884] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1864a0c-57dc-4f2c-b7a0-1133035684ec {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 811.057163] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-75bd0019-d357-471a-9453-018257d23ce2 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 811.063790] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b6a8b945-f78c-40a7-99c3-91bd286c8832 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 811.076709] env[65918]: DEBUG nova.compute.provider_tree [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Inventory has not changed in ProviderTree for provider: 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 {{(pid=65918) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 811.084668] env[65918]: DEBUG nova.scheduler.client.report [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Inventory has not changed for provider 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65918) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 811.097196] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=65918) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 811.097370] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.604s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 811.803223] env[65918]: WARNING oslo_vmware.rw_handles [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 
tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 811.803223] env[65918]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 811.803223] env[65918]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 811.803223] env[65918]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 811.803223] env[65918]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 811.803223] env[65918]: ERROR oslo_vmware.rw_handles response.begin() [ 811.803223] env[65918]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 811.803223] env[65918]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 811.803223] env[65918]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 811.803223] env[65918]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 811.803223] env[65918]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 811.803223] env[65918]: ERROR oslo_vmware.rw_handles [ 811.803885] env[65918]: DEBUG nova.virt.vmwareapi.images [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] Downloaded image file data e017c336-3a02-4b58-874a-44a1d1e154fd to vmware_temp/d2fc0795-1c9e-4207-a826-fdb8f00bc21b/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk on the data store datastore1 {{(pid=65918) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 811.805251] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] Caching image {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 811.805505] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Copying Virtual Disk [datastore1] vmware_temp/d2fc0795-1c9e-4207-a826-fdb8f00bc21b/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk to [datastore1] vmware_temp/d2fc0795-1c9e-4207-a826-fdb8f00bc21b/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk {{(pid=65918) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 811.805746] env[65918]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-99de0d55-bc9e-459e-bc4a-7fb781af0c54 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 811.813509] env[65918]: DEBUG oslo_vmware.api [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Waiting for the task: (returnval){ [ 811.813509] env[65918]: value = "task-2848189" [ 811.813509] env[65918]: _type = "Task" [ 811.813509] env[65918]: } to complete. 
{{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 811.821422] env[65918]: DEBUG oslo_vmware.api [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Task: {'id': task-2848189, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 812.098876] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 812.098876] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 812.098876] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=65918) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 812.323430] env[65918]: DEBUG oslo_vmware.exceptions [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Fault InvalidArgument not matched. {{(pid=65918) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 812.323688] env[65918]: DEBUG oslo_concurrency.lockutils [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 812.324337] env[65918]: ERROR nova.compute.manager [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 812.324337] env[65918]: Faults: ['InvalidArgument'] [ 812.324337] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] Traceback (most recent call last): [ 812.324337] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 812.324337] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] yield resources [ 812.324337] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 812.324337] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] self.driver.spawn(context, instance, image_meta, [ 812.324337] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 812.324337] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] self._vmops.spawn(context, instance, image_meta, injected_files, [ 
812.324337] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 812.324337] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] self._fetch_image_if_missing(context, vi) [ 812.324337] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 812.324819] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] image_cache(vi, tmp_image_ds_loc) [ 812.324819] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 812.324819] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] vm_util.copy_virtual_disk( [ 812.324819] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 812.324819] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] session._wait_for_task(vmdk_copy_task) [ 812.324819] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 812.324819] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] return self.wait_for_task(task_ref) [ 812.324819] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 812.324819] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] return evt.wait() [ 812.324819] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 812.324819] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] result = hub.switch() [ 812.324819] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 812.324819] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] return self.greenlet.switch() [ 812.325213] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 812.325213] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] self.f(*self.args, **self.kw) [ 812.325213] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 812.325213] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] raise exceptions.translate_fault(task_info.error) [ 812.325213] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 812.325213] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] Faults: 
['InvalidArgument'] [ 812.325213] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] [ 812.325213] env[65918]: INFO nova.compute.manager [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] Terminating instance [ 812.326133] env[65918]: DEBUG oslo_concurrency.lockutils [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 812.326339] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 812.326710] env[65918]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2d9dde39-2037-461b-8917-7627346c439a {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 812.329462] env[65918]: DEBUG nova.compute.manager [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] Start destroying the instance on the hypervisor. {{(pid=65918) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 812.329656] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] Destroying instance {{(pid=65918) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 812.330369] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-726be2a8-cb00-4784-a055-4178ef1eda3d {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 812.337235] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] Unregistering the VM {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 812.338132] env[65918]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-2d608122-655c-4bcd-968a-36439bcda680 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 812.339775] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 812.339948] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 
tempest-ServerRescueNegativeTestJSON-628996508-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=65918) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 812.340996] env[65918]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3e34bbf6-2416-4728-b725-d599b8edf006 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 812.346113] env[65918]: DEBUG oslo_vmware.api [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Waiting for the task: (returnval){ [ 812.346113] env[65918]: value = "session[523f5ec2-7667-1e2c-bfa3-3051980fd847]52ed75cf-3321-9ec9-48c3-475ff45e6f0f" [ 812.346113] env[65918]: _type = "Task" [ 812.346113] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 812.353381] env[65918]: DEBUG oslo_vmware.api [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Task: {'id': session[523f5ec2-7667-1e2c-bfa3-3051980fd847]52ed75cf-3321-9ec9-48c3-475ff45e6f0f, 'name': SearchDatastore_Task} progress is 0%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 812.402969] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] Unregistered the VM {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 812.403230] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] Deleting contents of the VM from datastore datastore1 {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 812.403409] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Deleting the datastore file [datastore1] cf0087c7-22d0-4317-a00a-73967ccafeaa {{(pid=65918) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 812.403666] env[65918]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-63561911-8577-4ccc-a5b8-bc173ef8b026 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 812.409998] env[65918]: DEBUG oslo_vmware.api [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Waiting for the task: (returnval){ [ 812.409998] env[65918]: value = "task-2848191" [ 812.409998] env[65918]: _type = "Task" [ 812.409998] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 812.417889] env[65918]: DEBUG oslo_vmware.api [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Task: {'id': task-2848191, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 812.423341] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 812.423505] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Starting heal instance info cache {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 812.423911] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Rebuilding the list of instances to heal {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 812.445512] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 812.445847] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 812.445992] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 812.446144] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 812.446275] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 812.446400] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 812.446522] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 812.446644] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Skipping network cache update for instance because it is Building. 
{{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 812.446763] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 812.447644] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 812.447644] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Didn't find any instances for network info cache update. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 812.447644] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 812.856703] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] Preparing fetch location {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 812.856984] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Creating directory with path [datastore1] vmware_temp/690480f6-c1f9-4e36-b760-5553e73abf18/e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 812.857209] env[65918]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-916aa5d2-a540-4f2e-9232-a1424fd0ebc5 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 812.869341] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Created directory with path [datastore1] vmware_temp/690480f6-c1f9-4e36-b760-5553e73abf18/e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 812.869341] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] Fetch image to [datastore1] vmware_temp/690480f6-c1f9-4e36-b760-5553e73abf18/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 812.869341] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] Downloading image file data e017c336-3a02-4b58-874a-44a1d1e154fd to [datastore1] 
vmware_temp/690480f6-c1f9-4e36-b760-5553e73abf18/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk on the data store datastore1 {{(pid=65918) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 812.870023] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-86f5883d-ee79-4359-abc6-73bae58529dc {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 812.876678] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4ed123f-bd4d-4726-a6fb-68cf6c1b5aff {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 812.886459] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f1e8071-35b5-47cc-acab-ce844f22bc1e {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 812.921200] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff340abe-14d6-4480-a2fe-207b5eec5476 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 812.930541] env[65918]: DEBUG oslo_vmware.api [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Task: {'id': task-2848191, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.079814} completed successfully. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 812.931108] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Deleted the datastore file {{(pid=65918) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 812.931309] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] Deleted contents of the VM from datastore datastore1 {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 812.931484] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] Instance destroyed {{(pid=65918) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 812.931657] env[65918]: INFO nova.compute.manager [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] Took 0.60 seconds to destroy the instance on the hypervisor. 
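The FileManager.DeleteDatastoreFile_Task call and the "progress is 0%" polling above follow the usual oslo.vmware pattern: issue the SOAP call through the session, get a Task moref back immediately, then let wait_for_task() poll it until it completes or raises a translated fault. A minimal sketch, assuming placeholder vCenter credentials and a placeholder datacenter moref (only the datastore path is taken from this log):

from oslo_vmware import api, vim_util

# Placeholder connection details; retry count and poll interval are illustrative
# settings, not values read from this log.
session = api.VMwareAPISession('vcenter.example.org', 'user', 'password',
                               api_retry_count=10, task_poll_interval=0.5)

file_manager = session.vim.service_content.fileManager
datacenter = vim_util.get_moref('datacenter-1', 'Datacenter')  # placeholder moref

# Issue the asynchronous SOAP call; a Task moref comes back right away.
task = session.invoke_api(session.vim, 'DeleteDatastoreFile_Task', file_manager,
                          name='[datastore1] cf0087c7-22d0-4317-a00a-73967ccafeaa',
                          datacenter=datacenter)

# wait_for_task() polls the task (the _poll_task "progress is ..." lines above)
# and raises a translated exception if the task finishes in error.
session.wait_for_task(task)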
[ 812.933187] env[65918]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-728810e6-a357-4838-ac30-582ae24f2f36 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 812.935148] env[65918]: DEBUG nova.compute.claims [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] Aborting claim: {{(pid=65918) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 812.936046] env[65918]: DEBUG oslo_concurrency.lockutils [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 812.936046] env[65918]: DEBUG oslo_concurrency.lockutils [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 812.962893] env[65918]: DEBUG nova.virt.vmwareapi.images [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] Downloading image file data e017c336-3a02-4b58-874a-44a1d1e154fd to the data store datastore1 {{(pid=65918) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 813.010864] env[65918]: DEBUG oslo_vmware.rw_handles [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/690480f6-c1f9-4e36-b760-5553e73abf18/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=65918) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 813.068342] env[65918]: DEBUG oslo_vmware.rw_handles [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Completed reading data from the image iterator. {{(pid=65918) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 813.069030] env[65918]: DEBUG oslo_vmware.rw_handles [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/690480f6-c1f9-4e36-b760-5553e73abf18/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=65918) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 813.303204] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b2f570f7-8fc4-469e-b032-c8f80838c70a {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 813.310736] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae08da54-8145-4f91-8d48-e144c0260275 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 813.340623] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-afd5ee7a-de2c-42e4-86bd-a44a04bcf930 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 813.347179] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dcbb3515-ece6-4595-b391-ad4814d8b4e5 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 813.359614] env[65918]: DEBUG nova.compute.provider_tree [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Inventory has not changed in ProviderTree for provider: 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 {{(pid=65918) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 813.367507] env[65918]: DEBUG nova.scheduler.client.report [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Inventory has not changed for provider 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65918) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 813.380544] env[65918]: DEBUG oslo_concurrency.lockutils [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.445s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 813.381082] env[65918]: ERROR nova.compute.manager [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 813.381082] env[65918]: Faults: ['InvalidArgument'] [ 813.381082] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] Traceback (most recent call last): [ 813.381082] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 813.381082] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] self.driver.spawn(context, 
instance, image_meta, [ 813.381082] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 813.381082] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] self._vmops.spawn(context, instance, image_meta, injected_files, [ 813.381082] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 813.381082] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] self._fetch_image_if_missing(context, vi) [ 813.381082] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 813.381082] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] image_cache(vi, tmp_image_ds_loc) [ 813.381082] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 813.381401] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] vm_util.copy_virtual_disk( [ 813.381401] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 813.381401] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] session._wait_for_task(vmdk_copy_task) [ 813.381401] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 813.381401] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] return self.wait_for_task(task_ref) [ 813.381401] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 813.381401] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] return evt.wait() [ 813.381401] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 813.381401] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] result = hub.switch() [ 813.381401] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 813.381401] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] return self.greenlet.switch() [ 813.381401] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 813.381401] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] self.f(*self.args, **self.kw) [ 813.381846] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 813.381846] env[65918]: ERROR nova.compute.manager [instance: 
cf0087c7-22d0-4317-a00a-73967ccafeaa] raise exceptions.translate_fault(task_info.error) [ 813.381846] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 813.381846] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] Faults: ['InvalidArgument'] [ 813.381846] env[65918]: ERROR nova.compute.manager [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] [ 813.381846] env[65918]: DEBUG nova.compute.utils [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] VimFaultException {{(pid=65918) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 813.383109] env[65918]: DEBUG nova.compute.manager [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] Build of instance cf0087c7-22d0-4317-a00a-73967ccafeaa was re-scheduled: A specified parameter was not correct: fileType [ 813.383109] env[65918]: Faults: ['InvalidArgument'] {{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 813.383483] env[65918]: DEBUG nova.compute.manager [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] Unplugging VIFs for instance {{(pid=65918) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 813.383650] env[65918]: DEBUG nova.compute.manager [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=65918) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 813.383801] env[65918]: DEBUG nova.compute.manager [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] Deallocating network for instance {{(pid=65918) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 813.383964] env[65918]: DEBUG nova.network.neutron [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] deallocate_for_instance() {{(pid=65918) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 813.703309] env[65918]: DEBUG nova.network.neutron [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] Updating instance_info_cache with network_info: [] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 813.715149] env[65918]: INFO nova.compute.manager [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] [instance: cf0087c7-22d0-4317-a00a-73967ccafeaa] Took 0.33 seconds to deallocate network for instance. 
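For each resource class in the inventory reported above, placement derives usable capacity as (total - reserved) * allocation_ratio, and the scheduler only places instances while usage stays under that capacity. Recomputing it from the logged inventory values (a quick illustration, not output from this run):

# Inventory values copied from the report above.
inventory = {
    'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0},
}

for rc, inv in inventory.items():
    capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
    print(f"{rc}: capacity = {capacity:g}")
# VCPU: capacity = 192, MEMORY_MB: capacity = 196078, DISK_GB: capacity = 400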
[ 813.813425] env[65918]: INFO nova.scheduler.client.report [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Deleted allocations for instance cf0087c7-22d0-4317-a00a-73967ccafeaa [ 813.829279] env[65918]: DEBUG oslo_concurrency.lockutils [None req-48ddeea5-ba74-400e-8cd3-cde3f3395e44 tempest-MigrationsAdminTest-305664523 tempest-MigrationsAdminTest-305664523-project-member] Lock "cf0087c7-22d0-4317-a00a-73967ccafeaa" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 193.499s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 813.853907] env[65918]: DEBUG nova.compute.manager [None req-502be215-496b-4061-872b-f8d891b6da74 tempest-ServersTestJSON-1292267645 tempest-ServersTestJSON-1292267645-project-member] [instance: 80ab0def-dc17-4708-a606-4671d3e869b0] Starting instance... {{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 813.880162] env[65918]: DEBUG nova.compute.manager [None req-502be215-496b-4061-872b-f8d891b6da74 tempest-ServersTestJSON-1292267645 tempest-ServersTestJSON-1292267645-project-member] [instance: 80ab0def-dc17-4708-a606-4671d3e869b0] Instance disappeared before build. {{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 813.909145] env[65918]: DEBUG oslo_concurrency.lockutils [None req-502be215-496b-4061-872b-f8d891b6da74 tempest-ServersTestJSON-1292267645 tempest-ServersTestJSON-1292267645-project-member] Lock "80ab0def-dc17-4708-a606-4671d3e869b0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 110.344s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 813.919964] env[65918]: DEBUG nova.compute.manager [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] Starting instance... 
{{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 813.971374] env[65918]: DEBUG oslo_concurrency.lockutils [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 813.971587] env[65918]: DEBUG oslo_concurrency.lockutils [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 813.973097] env[65918]: INFO nova.compute.claims [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 814.297941] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba382947-e49e-44cd-8ec6-7d709127f884 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 814.305359] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f397bef-e963-49f7-be99-eb8bc26d0362 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 814.335572] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5524a241-e2ce-4ccf-9187-7dfa34a1a016 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 814.342477] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc0ce7c9-2005-469f-98b1-22d6b35db0d4 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 814.355199] env[65918]: DEBUG nova.compute.provider_tree [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] Inventory has not changed in ProviderTree for provider: 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 {{(pid=65918) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 814.363548] env[65918]: DEBUG nova.scheduler.client.report [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] Inventory has not changed for provider 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65918) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 814.377092] env[65918]: DEBUG oslo_concurrency.lockutils [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 
tempest-ServerGroupTestJSON-1521414906-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.405s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 814.386105] env[65918]: DEBUG oslo_concurrency.lockutils [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] Acquiring lock "d2da9f27-0efe-408a-ace9-ac24b11938a5" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.._do_validation" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 814.386344] env[65918]: DEBUG oslo_concurrency.lockutils [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] Lock "d2da9f27-0efe-408a-ace9-ac24b11938a5" acquired by "nova.compute.manager.ComputeManager._validate_instance_group_policy.._do_validation" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 814.390814] env[65918]: DEBUG oslo_concurrency.lockutils [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] Lock "d2da9f27-0efe-408a-ace9-ac24b11938a5" "released" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.._do_validation" :: held 0.004s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 814.391266] env[65918]: DEBUG nova.compute.manager [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] Start building networks asynchronously for instance. {{(pid=65918) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 814.423178] env[65918]: DEBUG nova.compute.utils [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] Using /dev/sd instead of None {{(pid=65918) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 814.424465] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 814.425594] env[65918]: DEBUG nova.compute.manager [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] Allocating IP information in the background. 
{{(pid=65918) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 814.425761] env[65918]: DEBUG nova.network.neutron [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] allocate_for_instance() {{(pid=65918) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 814.433624] env[65918]: DEBUG nova.compute.manager [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] Start building block device mappings for instance. {{(pid=65918) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 814.446173] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 814.446392] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 814.503198] env[65918]: DEBUG nova.compute.manager [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] Start spawning the instance on the hypervisor. {{(pid=65918) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 814.527163] env[65918]: DEBUG nova.policy [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '727301708a1545a8a5437b9a639df6d7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4b244cba57d24b78a22912bfda286414', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=65918) authorize /opt/stack/nova/nova/policy.py:203}} [ 814.532217] env[65918]: DEBUG nova.virt.hardware [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:52:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:52:33Z,direct_url=,disk_format='vmdk',id=e017c336-3a02-4b58-874a-44a1d1e154fd,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='a06639ef48fd43768e273db76d6c8f54',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:52:33Z,virtual_size=,visibility=), allow threads: False {{(pid=65918) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:563}} [ 814.532365] env[65918]: DEBUG nova.virt.hardware [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] Flavor limits 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 814.532444] env[65918]: DEBUG nova.virt.hardware [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] Image limits 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 814.532615] env[65918]: DEBUG nova.virt.hardware [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] Flavor pref 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 814.532762] env[65918]: DEBUG nova.virt.hardware [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] Image pref 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 814.532909] env[65918]: DEBUG nova.virt.hardware [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 814.533124] env[65918]: DEBUG nova.virt.hardware [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 814.533282] env[65918]: DEBUG nova.virt.hardware [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=65918) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 814.533442] env[65918]: DEBUG nova.virt.hardware [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] Got 1 possible topologies {{(pid=65918) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 814.533601] env[65918]: DEBUG nova.virt.hardware [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 814.533766] env[65918]: DEBUG nova.virt.hardware [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 814.534631] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-878af730-a5ca-469b-a256-9410980ee473 {{(pid=65918) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 814.542486] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fffda454-d447-4368-9a1c-3595282a0215 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 814.908975] env[65918]: DEBUG nova.network.neutron [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] Successfully created port: 4dae9f4f-c480-4bc3-9495-a21cb248ee3d {{(pid=65918) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 815.423515] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 815.742590] env[65918]: DEBUG nova.compute.manager [req-67fbc197-0038-4d27-a966-f51f6d6384b8 req-c5c4edfa-b447-4f02-addf-e24dc346315b service nova] [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] Received event network-vif-plugged-4dae9f4f-c480-4bc3-9495-a21cb248ee3d {{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 815.742896] env[65918]: DEBUG oslo_concurrency.lockutils [req-67fbc197-0038-4d27-a966-f51f6d6384b8 req-c5c4edfa-b447-4f02-addf-e24dc346315b service nova] Acquiring lock "a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 815.744143] env[65918]: DEBUG oslo_concurrency.lockutils [req-67fbc197-0038-4d27-a966-f51f6d6384b8 req-c5c4edfa-b447-4f02-addf-e24dc346315b service nova] Lock "a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 815.744372] env[65918]: DEBUG oslo_concurrency.lockutils [req-67fbc197-0038-4d27-a966-f51f6d6384b8 req-c5c4edfa-b447-4f02-addf-e24dc346315b service nova] Lock "a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 815.744576] env[65918]: DEBUG nova.compute.manager [req-67fbc197-0038-4d27-a966-f51f6d6384b8 req-c5c4edfa-b447-4f02-addf-e24dc346315b service nova] [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] No waiting events found dispatching network-vif-plugged-4dae9f4f-c480-4bc3-9495-a21cb248ee3d {{(pid=65918) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 815.744769] env[65918]: WARNING nova.compute.manager [req-67fbc197-0038-4d27-a966-f51f6d6384b8 req-c5c4edfa-b447-4f02-addf-e24dc346315b service nova] [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] Received unexpected event network-vif-plugged-4dae9f4f-c480-4bc3-9495-a21cb248ee3d for instance with vm_state building and task_state spawning. 
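The nova.virt.hardware lines a few entries above walk through CPU-topology selection for this flavor: 1 vCPU with no flavor or image limits, so the effective maxima are 65536 sockets/cores/threads and the only topology whose product equals the vCPU count is 1:1:1. A simplified re-derivation of that enumeration (illustrative only; the real logic lives in nova/virt/hardware.py):

import itertools

def possible_topologies(vcpus, max_sockets, max_cores, max_threads):
    # Enumerate (sockets, cores, threads) whose product equals the vCPU count,
    # capping each axis at the vCPU count so the search space stays small.
    for s, c, t in itertools.product(range(1, min(max_sockets, vcpus) + 1),
                                     range(1, min(max_cores, vcpus) + 1),
                                     range(1, min(max_threads, vcpus) + 1)):
        if s * c * t == vcpus:
            yield s, c, t

print(list(possible_topologies(1, 65536, 65536, 65536)))   # -> [(1, 1, 1)]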
[ 815.824162] env[65918]: DEBUG nova.network.neutron [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] Successfully updated port: 4dae9f4f-c480-4bc3-9495-a21cb248ee3d {{(pid=65918) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 815.831217] env[65918]: DEBUG oslo_concurrency.lockutils [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] Acquiring lock "refresh_cache-a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 815.831357] env[65918]: DEBUG oslo_concurrency.lockutils [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] Acquired lock "refresh_cache-a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 815.831506] env[65918]: DEBUG nova.network.neutron [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] Building network info cache for instance {{(pid=65918) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 815.876356] env[65918]: DEBUG nova.network.neutron [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] Instance cache missing network info. {{(pid=65918) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 816.079345] env[65918]: DEBUG nova.network.neutron [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] Updating instance_info_cache with network_info: [{"id": "4dae9f4f-c480-4bc3-9495-a21cb248ee3d", "address": "fa:16:3e:40:04:a5", "network": {"id": "8bf5212b-b322-4050-a9ac-ec3190d71719", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-386197131-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4b244cba57d24b78a22912bfda286414", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "13e71dbb-4279-427c-b39d-ba5df9895e58", "external-id": "nsx-vlan-transportzone-417", "segmentation_id": 417, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4dae9f4f-c4", "ovs_interfaceid": "4dae9f4f-c480-4bc3-9495-a21cb248ee3d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 816.092588] env[65918]: DEBUG oslo_concurrency.lockutils [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 
tempest-ServerGroupTestJSON-1521414906-project-member] Releasing lock "refresh_cache-a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 816.092935] env[65918]: DEBUG nova.compute.manager [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] Instance network_info: |[{"id": "4dae9f4f-c480-4bc3-9495-a21cb248ee3d", "address": "fa:16:3e:40:04:a5", "network": {"id": "8bf5212b-b322-4050-a9ac-ec3190d71719", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-386197131-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4b244cba57d24b78a22912bfda286414", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "13e71dbb-4279-427c-b39d-ba5df9895e58", "external-id": "nsx-vlan-transportzone-417", "segmentation_id": 417, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4dae9f4f-c4", "ovs_interfaceid": "4dae9f4f-c480-4bc3-9495-a21cb248ee3d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=65918) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 816.093331] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:40:04:a5', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '13e71dbb-4279-427c-b39d-ba5df9895e58', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '4dae9f4f-c480-4bc3-9495-a21cb248ee3d', 'vif_model': 'vmxnet3'}] {{(pid=65918) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 816.100735] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] Creating folder: Project (4b244cba57d24b78a22912bfda286414). Parent ref: group-v572679. {{(pid=65918) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 816.101323] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c82a1f80-a9c9-4ae7-9869-e9dd93ffd02f {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 816.112781] env[65918]: INFO nova.virt.vmwareapi.vm_util [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] Created folder: Project (4b244cba57d24b78a22912bfda286414) in parent group-v572679. [ 816.112954] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] Creating folder: Instances. Parent ref: group-v572721. 
{{(pid=65918) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 816.113192] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-30cd91b5-c38e-4a70-adc4-02fd2156dbd7 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 816.122324] env[65918]: INFO nova.virt.vmwareapi.vm_util [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] Created folder: Instances in parent group-v572721. [ 816.122550] env[65918]: DEBUG oslo.service.loopingcall [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=65918) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 816.122729] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] Creating VM on the ESX host {{(pid=65918) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 816.122949] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-bf7c2c14-ad39-4e37-90a0-05a5b20924ff {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 816.141884] env[65918]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 816.141884] env[65918]: value = "task-2848194" [ 816.141884] env[65918]: _type = "Task" [ 816.141884] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 816.149850] env[65918]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848194, 'name': CreateVM_Task} progress is 0%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 816.652301] env[65918]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848194, 'name': CreateVM_Task, 'duration_secs': 0.314214} completed successfully. 
{{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 816.652463] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] Created VM on the ESX host {{(pid=65918) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 816.653141] env[65918]: DEBUG oslo_concurrency.lockutils [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 816.653302] env[65918]: DEBUG oslo_concurrency.lockutils [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 816.653590] env[65918]: DEBUG oslo_concurrency.lockutils [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 816.653829] env[65918]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9059c0f5-4674-4688-9461-c2c6cbdb3180 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 816.660022] env[65918]: DEBUG oslo_vmware.api [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] Waiting for the task: (returnval){ [ 816.660022] env[65918]: value = "session[523f5ec2-7667-1e2c-bfa3-3051980fd847]52765e6b-19e9-58f0-f8e9-51c21cbbc101" [ 816.660022] env[65918]: _type = "Task" [ 816.660022] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 816.665740] env[65918]: DEBUG oslo_vmware.api [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] Task: {'id': session[523f5ec2-7667-1e2c-bfa3-3051980fd847]52765e6b-19e9-58f0-f8e9-51c21cbbc101, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 817.168261] env[65918]: DEBUG oslo_concurrency.lockutils [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 817.168556] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] Processing image e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 817.168743] env[65918]: DEBUG oslo_concurrency.lockutils [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 817.846121] env[65918]: DEBUG nova.compute.manager [req-78aff7b1-954f-4132-aa1e-a6d1da04a6a6 req-0038b684-4c6d-400a-8585-d1bb0639a6f7 service nova] [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] Received event network-changed-4dae9f4f-c480-4bc3-9495-a21cb248ee3d {{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 817.846121] env[65918]: DEBUG nova.compute.manager [req-78aff7b1-954f-4132-aa1e-a6d1da04a6a6 req-0038b684-4c6d-400a-8585-d1bb0639a6f7 service nova] [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] Refreshing instance network info cache due to event network-changed-4dae9f4f-c480-4bc3-9495-a21cb248ee3d. {{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 817.846121] env[65918]: DEBUG oslo_concurrency.lockutils [req-78aff7b1-954f-4132-aa1e-a6d1da04a6a6 req-0038b684-4c6d-400a-8585-d1bb0639a6f7 service nova] Acquiring lock "refresh_cache-a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 817.846121] env[65918]: DEBUG oslo_concurrency.lockutils [req-78aff7b1-954f-4132-aa1e-a6d1da04a6a6 req-0038b684-4c6d-400a-8585-d1bb0639a6f7 service nova] Acquired lock "refresh_cache-a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 817.846121] env[65918]: DEBUG nova.network.neutron [req-78aff7b1-954f-4132-aa1e-a6d1da04a6a6 req-0038b684-4c6d-400a-8585-d1bb0639a6f7 service nova] [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] Refreshing network info cache for port 4dae9f4f-c480-4bc3-9495-a21cb248ee3d {{(pid=65918) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 818.179231] env[65918]: DEBUG nova.network.neutron [req-78aff7b1-954f-4132-aa1e-a6d1da04a6a6 req-0038b684-4c6d-400a-8585-d1bb0639a6f7 service nova] [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] Updated VIF entry in instance network info cache for port 4dae9f4f-c480-4bc3-9495-a21cb248ee3d. 
{{(pid=65918) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 818.179617] env[65918]: DEBUG nova.network.neutron [req-78aff7b1-954f-4132-aa1e-a6d1da04a6a6 req-0038b684-4c6d-400a-8585-d1bb0639a6f7 service nova] [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] Updating instance_info_cache with network_info: [{"id": "4dae9f4f-c480-4bc3-9495-a21cb248ee3d", "address": "fa:16:3e:40:04:a5", "network": {"id": "8bf5212b-b322-4050-a9ac-ec3190d71719", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-386197131-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4b244cba57d24b78a22912bfda286414", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "13e71dbb-4279-427c-b39d-ba5df9895e58", "external-id": "nsx-vlan-transportzone-417", "segmentation_id": 417, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4dae9f4f-c4", "ovs_interfaceid": "4dae9f4f-c480-4bc3-9495-a21cb248ee3d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 818.195222] env[65918]: DEBUG oslo_concurrency.lockutils [req-78aff7b1-954f-4132-aa1e-a6d1da04a6a6 req-0038b684-4c6d-400a-8585-d1bb0639a6f7 service nova] Releasing lock "refresh_cache-a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 822.682633] env[65918]: DEBUG oslo_concurrency.lockutils [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] Acquiring lock "c9932955-3b82-4c30-9441-b33695340ed2" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 822.682917] env[65918]: DEBUG oslo_concurrency.lockutils [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] Lock "c9932955-3b82-4c30-9441-b33695340ed2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 826.359760] env[65918]: DEBUG oslo_concurrency.lockutils [None req-3e30df49-8ce5-4b43-90a2-b678517e6b00 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Acquiring lock "bd0158bd-e255-4680-b00e-81eb1ce88ad5" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 826.760945] env[65918]: DEBUG oslo_concurrency.lockutils [None req-e4c57f0d-a21b-46bf-9f06-d1093ceaaecc tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Acquiring lock 
"5c83d7da-f63b-40b7-a1aa-916ba9343439" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 829.981309] env[65918]: DEBUG oslo_concurrency.lockutils [None req-b5b4e98e-5beb-4767-b693-6ae11076c6f2 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Acquiring lock "32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 830.380418] env[65918]: DEBUG oslo_concurrency.lockutils [None req-5c76c5ee-e848-4309-8d04-bdef433be87a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Acquiring lock "51163f89-c8b6-48a8-bbbe-de63c44d92a5" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 832.128839] env[65918]: DEBUG oslo_concurrency.lockutils [None req-fb2be1a6-8e8c-4f90-8667-a6240f41667f tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Acquiring lock "bba6f3d9-1be3-4048-86d5-f435511b0fc0" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 833.234405] env[65918]: DEBUG oslo_concurrency.lockutils [None req-db2dabec-fe32-42c0-8e31-6056905342bc tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Acquiring lock "934b5745-6c9a-4d21-92b8-7505a170e600" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 858.164419] env[65918]: WARNING oslo_vmware.rw_handles [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 858.164419] env[65918]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 858.164419] env[65918]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 858.164419] env[65918]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 858.164419] env[65918]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 858.164419] env[65918]: ERROR oslo_vmware.rw_handles response.begin() [ 858.164419] env[65918]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 858.164419] env[65918]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 858.164419] env[65918]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 858.164419] env[65918]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 858.164419] env[65918]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 858.164419] env[65918]: ERROR oslo_vmware.rw_handles [ 858.165130] env[65918]: DEBUG nova.virt.vmwareapi.images [None 
req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] Downloaded image file data e017c336-3a02-4b58-874a-44a1d1e154fd to vmware_temp/690480f6-c1f9-4e36-b760-5553e73abf18/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk on the data store datastore1 {{(pid=65918) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 858.166508] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] Caching image {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 858.166763] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Copying Virtual Disk [datastore1] vmware_temp/690480f6-c1f9-4e36-b760-5553e73abf18/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk to [datastore1] vmware_temp/690480f6-c1f9-4e36-b760-5553e73abf18/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk {{(pid=65918) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 858.167055] env[65918]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-3bbfa07c-80b9-4e44-a211-6583cb7dad30 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 858.175397] env[65918]: DEBUG oslo_vmware.api [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Waiting for the task: (returnval){ [ 858.175397] env[65918]: value = "task-2848195" [ 858.175397] env[65918]: _type = "Task" [ 858.175397] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 858.182673] env[65918]: DEBUG oslo_vmware.api [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Task: {'id': task-2848195, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 858.685799] env[65918]: DEBUG oslo_vmware.exceptions [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Fault InvalidArgument not matched. 
{{(pid=65918) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 858.686560] env[65918]: DEBUG oslo_concurrency.lockutils [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 858.686933] env[65918]: ERROR nova.compute.manager [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 858.686933] env[65918]: Faults: ['InvalidArgument'] [ 858.686933] env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] Traceback (most recent call last): [ 858.686933] env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 858.686933] env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] yield resources [ 858.686933] env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 858.686933] env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] self.driver.spawn(context, instance, image_meta, [ 858.686933] env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 858.686933] env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] self._vmops.spawn(context, instance, image_meta, injected_files, [ 858.686933] env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 858.686933] env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] self._fetch_image_if_missing(context, vi) [ 858.686933] env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 858.687356] env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] image_cache(vi, tmp_image_ds_loc) [ 858.687356] env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 858.687356] env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] vm_util.copy_virtual_disk( [ 858.687356] env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 858.687356] env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] session._wait_for_task(vmdk_copy_task) [ 858.687356] env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, 
in _wait_for_task [ 858.687356] env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] return self.wait_for_task(task_ref) [ 858.687356] env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 858.687356] env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] return evt.wait() [ 858.687356] env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 858.687356] env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] result = hub.switch() [ 858.687356] env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 858.687356] env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] return self.greenlet.switch() [ 858.687770] env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 858.687770] env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] self.f(*self.args, **self.kw) [ 858.687770] env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 858.687770] env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] raise exceptions.translate_fault(task_info.error) [ 858.687770] env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 858.687770] env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] Faults: ['InvalidArgument'] [ 858.687770] env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] [ 858.687770] env[65918]: INFO nova.compute.manager [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] Terminating instance [ 858.688798] env[65918]: DEBUG oslo_concurrency.lockutils [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 858.689013] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 858.689261] env[65918]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6babdacf-cf42-433f-80da-ce2bdacad332 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 858.691438] env[65918]: DEBUG 
nova.compute.manager [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] Start destroying the instance on the hypervisor. {{(pid=65918) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 858.691632] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] Destroying instance {{(pid=65918) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 858.692401] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b38d5155-5b51-41c1-ba5a-190ef03b38cb {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 858.699021] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] Unregistering the VM {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 858.699231] env[65918]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-94ec86f4-e4c1-4a3f-acc0-81201648f647 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 858.701331] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 858.701500] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=65918) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 858.702516] env[65918]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b4be7eed-877b-4368-a4cc-5fc31904908d {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 858.706891] env[65918]: DEBUG oslo_vmware.api [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Waiting for the task: (returnval){ [ 858.706891] env[65918]: value = "session[523f5ec2-7667-1e2c-bfa3-3051980fd847]52980182-4284-54a4-2436-e7cf519718ae" [ 858.706891] env[65918]: _type = "Task" [ 858.706891] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 858.713718] env[65918]: DEBUG oslo_vmware.api [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Task: {'id': session[523f5ec2-7667-1e2c-bfa3-3051980fd847]52980182-4284-54a4-2436-e7cf519718ae, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 858.764242] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] Unregistered the VM {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 858.764511] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] Deleting contents of the VM from datastore datastore1 {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 858.764723] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Deleting the datastore file [datastore1] 7bc8087e-17e1-4cbd-84be-bd6c07e104ce {{(pid=65918) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 858.765020] env[65918]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-83fbd621-7f29-4808-858c-b3b9d8b98439 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 858.771871] env[65918]: DEBUG oslo_vmware.api [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Waiting for the task: (returnval){ [ 858.771871] env[65918]: value = "task-2848197" [ 858.771871] env[65918]: _type = "Task" [ 858.771871] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 858.779386] env[65918]: DEBUG oslo_vmware.api [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Task: {'id': task-2848197, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 859.217708] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Preparing fetch location {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 859.217976] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Creating directory with path [datastore1] vmware_temp/ebc704ec-5f2c-45a7-bbc7-f16358999990/e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 859.218229] env[65918]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-971f5159-0b22-4eac-93bb-2cb8d92b1539 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 859.230279] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Created directory with path [datastore1] vmware_temp/ebc704ec-5f2c-45a7-bbc7-f16358999990/e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 859.230479] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Fetch image to [datastore1] vmware_temp/ebc704ec-5f2c-45a7-bbc7-f16358999990/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 859.230644] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Downloading image file data e017c336-3a02-4b58-874a-44a1d1e154fd to [datastore1] vmware_temp/ebc704ec-5f2c-45a7-bbc7-f16358999990/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk on the data store datastore1 {{(pid=65918) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 859.231451] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-41c97ac7-c208-42f3-b6d4-0744119c7d4e {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 859.238209] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e090f287-2089-46e6-bdbb-e7bf2ac14870 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 859.247556] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44b3831d-ea6f-4800-96e1-ee917b87a461 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 859.282697] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df79432d-b27f-46da-a23b-411502342a37 {{(pid=65918) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 859.290140] env[65918]: DEBUG oslo_vmware.api [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Task: {'id': task-2848197, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.082344} completed successfully. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 859.291605] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Deleted the datastore file {{(pid=65918) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 859.291797] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] Deleted contents of the VM from datastore datastore1 {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 859.291992] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] Instance destroyed {{(pid=65918) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 859.292220] env[65918]: INFO nova.compute.manager [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 859.294203] env[65918]: DEBUG nova.compute.claims [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] Aborting claim: {{(pid=65918) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 859.294371] env[65918]: DEBUG oslo_concurrency.lockutils [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 859.294575] env[65918]: DEBUG oslo_concurrency.lockutils [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 859.297017] env[65918]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-ae025012-2fe7-4cce-9079-33c8c83a5b96 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 859.316673] env[65918]: DEBUG nova.virt.vmwareapi.images [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Downloading image file data e017c336-3a02-4b58-874a-44a1d1e154fd to the data store datastore1 {{(pid=65918) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 859.382899] env[65918]: DEBUG oslo_vmware.rw_handles [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ebc704ec-5f2c-45a7-bbc7-f16358999990/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=65918) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 859.444065] env[65918]: DEBUG oslo_vmware.rw_handles [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Completed reading data from the image iterator. {{(pid=65918) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 859.444267] env[65918]: DEBUG oslo_vmware.rw_handles [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ebc704ec-5f2c-45a7-bbc7-f16358999990/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=65918) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 859.681247] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11affa37-246d-4174-862d-0861934f64d8 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 859.688713] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab489003-e655-416a-aa46-a5943fb7f5f0 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 859.718489] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce3cff3e-e611-47b7-9e11-c60aa86f2b58 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 859.725265] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5314fae0-3165-449c-9143-8781c219384e {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 859.738824] env[65918]: DEBUG nova.compute.provider_tree [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Inventory has not changed in ProviderTree for provider: 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 {{(pid=65918) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 859.747526] env[65918]: DEBUG nova.scheduler.client.report [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Inventory has not changed for provider 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65918) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 859.761568] env[65918]: DEBUG oslo_concurrency.lockutils [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.467s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 859.762137] env[65918]: ERROR nova.compute.manager [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 859.762137] env[65918]: Faults: ['InvalidArgument'] [ 859.762137] env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] Traceback (most recent call last): [ 859.762137] env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 859.762137] env[65918]: ERROR nova.compute.manager 
[instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] self.driver.spawn(context, instance, image_meta, [ 859.762137] env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 859.762137] env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] self._vmops.spawn(context, instance, image_meta, injected_files, [ 859.762137] env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 859.762137] env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] self._fetch_image_if_missing(context, vi) [ 859.762137] env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 859.762137] env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] image_cache(vi, tmp_image_ds_loc) [ 859.762137] env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 859.762529] env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] vm_util.copy_virtual_disk( [ 859.762529] env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 859.762529] env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] session._wait_for_task(vmdk_copy_task) [ 859.762529] env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 859.762529] env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] return self.wait_for_task(task_ref) [ 859.762529] env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 859.762529] env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] return evt.wait() [ 859.762529] env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 859.762529] env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] result = hub.switch() [ 859.762529] env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 859.762529] env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] return self.greenlet.switch() [ 859.762529] env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 859.762529] env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] self.f(*self.args, **self.kw) [ 859.762897] env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 859.762897] 
env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] raise exceptions.translate_fault(task_info.error) [ 859.762897] env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 859.762897] env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] Faults: ['InvalidArgument'] [ 859.762897] env[65918]: ERROR nova.compute.manager [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] [ 859.762897] env[65918]: DEBUG nova.compute.utils [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] VimFaultException {{(pid=65918) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 859.764230] env[65918]: DEBUG nova.compute.manager [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] Build of instance 7bc8087e-17e1-4cbd-84be-bd6c07e104ce was re-scheduled: A specified parameter was not correct: fileType [ 859.764230] env[65918]: Faults: ['InvalidArgument'] {{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 859.764605] env[65918]: DEBUG nova.compute.manager [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] Unplugging VIFs for instance {{(pid=65918) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 859.764785] env[65918]: DEBUG nova.compute.manager [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=65918) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 859.764997] env[65918]: DEBUG nova.compute.manager [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] Deallocating network for instance {{(pid=65918) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 859.765209] env[65918]: DEBUG nova.network.neutron [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] deallocate_for_instance() {{(pid=65918) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 860.017792] env[65918]: DEBUG nova.network.neutron [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] Updating instance_info_cache with network_info: [] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 860.029585] env[65918]: INFO nova.compute.manager [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: 7bc8087e-17e1-4cbd-84be-bd6c07e104ce] Took 0.26 seconds to deallocate network for instance. [ 860.118741] env[65918]: INFO nova.scheduler.client.report [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Deleted allocations for instance 7bc8087e-17e1-4cbd-84be-bd6c07e104ce [ 860.134281] env[65918]: DEBUG oslo_concurrency.lockutils [None req-3f51452c-013c-4839-b871-ce26201e6683 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Lock "7bc8087e-17e1-4cbd-84be-bd6c07e104ce" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 238.158s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 860.150031] env[65918]: DEBUG nova.compute.manager [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] Starting instance... 
{{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 860.198929] env[65918]: DEBUG oslo_concurrency.lockutils [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 860.199337] env[65918]: DEBUG oslo_concurrency.lockutils [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 860.200854] env[65918]: INFO nova.compute.claims [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 860.536827] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fdd5f5ad-4086-4a7f-9688-ae4d385161c8 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 860.544712] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0af55765-4926-43a1-9f41-223c505c9c48 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 860.575064] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce487dd4-3775-4c26-9d18-62acbd536743 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 860.581331] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2cad3ad3-19fe-4447-966f-de094c81b866 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 860.593997] env[65918]: DEBUG nova.compute.provider_tree [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] Inventory has not changed in ProviderTree for provider: 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 {{(pid=65918) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 860.603508] env[65918]: DEBUG nova.scheduler.client.report [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] Inventory has not changed for provider 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65918) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 860.616183] env[65918]: DEBUG 
oslo_concurrency.lockutils [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.417s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 860.616655] env[65918]: DEBUG nova.compute.manager [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] Start building networks asynchronously for instance. {{(pid=65918) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 860.648027] env[65918]: DEBUG nova.compute.utils [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] Using /dev/sd instead of None {{(pid=65918) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 860.649689] env[65918]: DEBUG nova.compute.manager [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] Allocating IP information in the background. {{(pid=65918) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 860.649689] env[65918]: DEBUG nova.network.neutron [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] allocate_for_instance() {{(pid=65918) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 860.657652] env[65918]: DEBUG nova.compute.manager [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] Start building block device mappings for instance. {{(pid=65918) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 860.724921] env[65918]: DEBUG nova.compute.manager [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] Start spawning the instance on the hypervisor. 
{{(pid=65918) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 860.733584] env[65918]: DEBUG nova.policy [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6a69e324d2414841b503bcb6c6741afb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '173553df27684179bf878821a1268af1', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=65918) authorize /opt/stack/nova/nova/policy.py:203}} [ 860.749598] env[65918]: DEBUG nova.virt.hardware [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:52:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:52:33Z,direct_url=,disk_format='vmdk',id=e017c336-3a02-4b58-874a-44a1d1e154fd,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='a06639ef48fd43768e273db76d6c8f54',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:52:33Z,virtual_size=,visibility=), allow threads: False {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 860.749840] env[65918]: DEBUG nova.virt.hardware [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] Flavor limits 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 860.749997] env[65918]: DEBUG nova.virt.hardware [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] Image limits 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 860.750200] env[65918]: DEBUG nova.virt.hardware [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] Flavor pref 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 860.750350] env[65918]: DEBUG nova.virt.hardware [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] Image pref 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 860.750497] env[65918]: DEBUG nova.virt.hardware [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=65918) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 860.750701] env[65918]: DEBUG nova.virt.hardware [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 860.750859] env[65918]: DEBUG nova.virt.hardware [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=65918) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 860.751031] env[65918]: DEBUG nova.virt.hardware [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] Got 1 possible topologies {{(pid=65918) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 860.751204] env[65918]: DEBUG nova.virt.hardware [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 860.751372] env[65918]: DEBUG nova.virt.hardware [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 860.752227] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7ba5ab8-260c-413c-9bfc-e140ec9b5efe {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 860.760104] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-110028bd-5d0a-4b97-b6c1-fdf19f5dc7c1 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 861.077014] env[65918]: DEBUG nova.network.neutron [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] Successfully created port: d3eb9665-bebf-45d4-8ebf-32c0202dca66 {{(pid=65918) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 861.805595] env[65918]: DEBUG nova.network.neutron [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] Successfully updated port: d3eb9665-bebf-45d4-8ebf-32c0202dca66 {{(pid=65918) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 861.818681] env[65918]: DEBUG oslo_concurrency.lockutils [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] Acquiring lock 
"refresh_cache-c04e5253-0275-4fb3-8eca-6a395c95930f" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 861.819033] env[65918]: DEBUG oslo_concurrency.lockutils [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] Acquired lock "refresh_cache-c04e5253-0275-4fb3-8eca-6a395c95930f" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 861.819329] env[65918]: DEBUG nova.network.neutron [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] Building network info cache for instance {{(pid=65918) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 861.863348] env[65918]: DEBUG nova.network.neutron [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] Instance cache missing network info. {{(pid=65918) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 862.067825] env[65918]: DEBUG nova.network.neutron [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] Updating instance_info_cache with network_info: [{"id": "d3eb9665-bebf-45d4-8ebf-32c0202dca66", "address": "fa:16:3e:45:a8:46", "network": {"id": "087e1586-cecf-4c8e-9b6b-042234a4529c", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-874008442-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "173553df27684179bf878821a1268af1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "257e5ea7-8b80-4301-9900-a754f1fe2031", "external-id": "nsx-vlan-transportzone-682", "segmentation_id": 682, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd3eb9665-be", "ovs_interfaceid": "d3eb9665-bebf-45d4-8ebf-32c0202dca66", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 862.078904] env[65918]: DEBUG oslo_concurrency.lockutils [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] Releasing lock "refresh_cache-c04e5253-0275-4fb3-8eca-6a395c95930f" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 862.079220] env[65918]: DEBUG nova.compute.manager [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] [instance: 
c04e5253-0275-4fb3-8eca-6a395c95930f] Instance network_info: |[{"id": "d3eb9665-bebf-45d4-8ebf-32c0202dca66", "address": "fa:16:3e:45:a8:46", "network": {"id": "087e1586-cecf-4c8e-9b6b-042234a4529c", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-874008442-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "173553df27684179bf878821a1268af1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "257e5ea7-8b80-4301-9900-a754f1fe2031", "external-id": "nsx-vlan-transportzone-682", "segmentation_id": 682, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd3eb9665-be", "ovs_interfaceid": "d3eb9665-bebf-45d4-8ebf-32c0202dca66", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=65918) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 862.079594] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:45:a8:46', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '257e5ea7-8b80-4301-9900-a754f1fe2031', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'd3eb9665-bebf-45d4-8ebf-32c0202dca66', 'vif_model': 'vmxnet3'}] {{(pid=65918) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 862.087279] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] Creating folder: Project (173553df27684179bf878821a1268af1). Parent ref: group-v572679. 
{{(pid=65918) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 862.087803] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-473ea899-db79-4e0f-b50b-8976fa15e77c {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 862.094787] env[65918]: DEBUG nova.compute.manager [req-cff26e5b-e3a3-4273-9591-a483df74f323 req-15efa7f0-177b-4272-aa36-7a58061118c5 service nova] [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] Received event network-vif-plugged-d3eb9665-bebf-45d4-8ebf-32c0202dca66 {{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 862.094987] env[65918]: DEBUG oslo_concurrency.lockutils [req-cff26e5b-e3a3-4273-9591-a483df74f323 req-15efa7f0-177b-4272-aa36-7a58061118c5 service nova] Acquiring lock "c04e5253-0275-4fb3-8eca-6a395c95930f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 862.095200] env[65918]: DEBUG oslo_concurrency.lockutils [req-cff26e5b-e3a3-4273-9591-a483df74f323 req-15efa7f0-177b-4272-aa36-7a58061118c5 service nova] Lock "c04e5253-0275-4fb3-8eca-6a395c95930f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 862.095358] env[65918]: DEBUG oslo_concurrency.lockutils [req-cff26e5b-e3a3-4273-9591-a483df74f323 req-15efa7f0-177b-4272-aa36-7a58061118c5 service nova] Lock "c04e5253-0275-4fb3-8eca-6a395c95930f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 862.095531] env[65918]: DEBUG nova.compute.manager [req-cff26e5b-e3a3-4273-9591-a483df74f323 req-15efa7f0-177b-4272-aa36-7a58061118c5 service nova] [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] No waiting events found dispatching network-vif-plugged-d3eb9665-bebf-45d4-8ebf-32c0202dca66 {{(pid=65918) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 862.095688] env[65918]: WARNING nova.compute.manager [req-cff26e5b-e3a3-4273-9591-a483df74f323 req-15efa7f0-177b-4272-aa36-7a58061118c5 service nova] [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] Received unexpected event network-vif-plugged-d3eb9665-bebf-45d4-8ebf-32c0202dca66 for instance with vm_state building and task_state spawning. [ 862.095840] env[65918]: DEBUG nova.compute.manager [req-cff26e5b-e3a3-4273-9591-a483df74f323 req-15efa7f0-177b-4272-aa36-7a58061118c5 service nova] [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] Received event network-changed-d3eb9665-bebf-45d4-8ebf-32c0202dca66 {{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 862.095987] env[65918]: DEBUG nova.compute.manager [req-cff26e5b-e3a3-4273-9591-a483df74f323 req-15efa7f0-177b-4272-aa36-7a58061118c5 service nova] [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] Refreshing instance network info cache due to event network-changed-d3eb9665-bebf-45d4-8ebf-32c0202dca66. 
{{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 862.096173] env[65918]: DEBUG oslo_concurrency.lockutils [req-cff26e5b-e3a3-4273-9591-a483df74f323 req-15efa7f0-177b-4272-aa36-7a58061118c5 service nova] Acquiring lock "refresh_cache-c04e5253-0275-4fb3-8eca-6a395c95930f" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 862.096305] env[65918]: DEBUG oslo_concurrency.lockutils [req-cff26e5b-e3a3-4273-9591-a483df74f323 req-15efa7f0-177b-4272-aa36-7a58061118c5 service nova] Acquired lock "refresh_cache-c04e5253-0275-4fb3-8eca-6a395c95930f" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 862.096452] env[65918]: DEBUG nova.network.neutron [req-cff26e5b-e3a3-4273-9591-a483df74f323 req-15efa7f0-177b-4272-aa36-7a58061118c5 service nova] [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] Refreshing network info cache for port d3eb9665-bebf-45d4-8ebf-32c0202dca66 {{(pid=65918) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 862.100029] env[65918]: INFO nova.virt.vmwareapi.vm_util [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] Created folder: Project (173553df27684179bf878821a1268af1) in parent group-v572679. [ 862.100212] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] Creating folder: Instances. Parent ref: group-v572724. {{(pid=65918) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 862.100428] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-df6a4fdc-4304-4098-aa51-268ea820c304 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 862.109560] env[65918]: INFO nova.virt.vmwareapi.vm_util [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] Created folder: Instances in parent group-v572724. [ 862.109786] env[65918]: DEBUG oslo.service.loopingcall [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=65918) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 862.109953] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] Creating VM on the ESX host {{(pid=65918) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 862.110154] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-bac8993c-9367-4ff5-b536-1e516c70d359 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 862.132061] env[65918]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 862.132061] env[65918]: value = "task-2848200" [ 862.132061] env[65918]: _type = "Task" [ 862.132061] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 862.143529] env[65918]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848200, 'name': CreateVM_Task} progress is 0%. 
{{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 862.580248] env[65918]: DEBUG nova.network.neutron [req-cff26e5b-e3a3-4273-9591-a483df74f323 req-15efa7f0-177b-4272-aa36-7a58061118c5 service nova] [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] Updated VIF entry in instance network info cache for port d3eb9665-bebf-45d4-8ebf-32c0202dca66. {{(pid=65918) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 862.580670] env[65918]: DEBUG nova.network.neutron [req-cff26e5b-e3a3-4273-9591-a483df74f323 req-15efa7f0-177b-4272-aa36-7a58061118c5 service nova] [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] Updating instance_info_cache with network_info: [{"id": "d3eb9665-bebf-45d4-8ebf-32c0202dca66", "address": "fa:16:3e:45:a8:46", "network": {"id": "087e1586-cecf-4c8e-9b6b-042234a4529c", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-874008442-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "173553df27684179bf878821a1268af1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "257e5ea7-8b80-4301-9900-a754f1fe2031", "external-id": "nsx-vlan-transportzone-682", "segmentation_id": 682, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd3eb9665-be", "ovs_interfaceid": "d3eb9665-bebf-45d4-8ebf-32c0202dca66", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 862.590595] env[65918]: DEBUG oslo_concurrency.lockutils [req-cff26e5b-e3a3-4273-9591-a483df74f323 req-15efa7f0-177b-4272-aa36-7a58061118c5 service nova] Releasing lock "refresh_cache-c04e5253-0275-4fb3-8eca-6a395c95930f" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 862.642248] env[65918]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848200, 'name': CreateVM_Task} progress is 25%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 863.142831] env[65918]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848200, 'name': CreateVM_Task} progress is 25%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 863.643892] env[65918]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848200, 'name': CreateVM_Task} progress is 25%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 864.144490] env[65918]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848200, 'name': CreateVM_Task} progress is 25%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 864.646630] env[65918]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848200, 'name': CreateVM_Task, 'duration_secs': 2.112328} completed successfully. 
{{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 864.646812] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] Created VM on the ESX host {{(pid=65918) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 864.647486] env[65918]: DEBUG oslo_concurrency.lockutils [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 864.647649] env[65918]: DEBUG oslo_concurrency.lockutils [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 864.647954] env[65918]: DEBUG oslo_concurrency.lockutils [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 864.648207] env[65918]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ef2c4301-3e2d-47b7-b696-8208b4c6d00e {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 864.652646] env[65918]: DEBUG oslo_vmware.api [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] Waiting for the task: (returnval){ [ 864.652646] env[65918]: value = "session[523f5ec2-7667-1e2c-bfa3-3051980fd847]52ebaf4c-894a-ca04-69f4-23d7dc20b76c" [ 864.652646] env[65918]: _type = "Task" [ 864.652646] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 864.660117] env[65918]: DEBUG oslo_vmware.api [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] Task: {'id': session[523f5ec2-7667-1e2c-bfa3-3051980fd847]52ebaf4c-894a-ca04-69f4-23d7dc20b76c, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 865.163883] env[65918]: DEBUG oslo_concurrency.lockutils [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 865.163883] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] Processing image e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 865.163883] env[65918]: DEBUG oslo_concurrency.lockutils [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 870.423524] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 870.423791] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Cleaning up deleted instances {{(pid=65918) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11101}} [ 870.441939] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] There are 0 instances to clean {{(pid=65918) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11110}} [ 870.442192] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 870.442360] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Cleaning up deleted instances with incomplete migration {{(pid=65918) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11139}} [ 870.451678] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 871.452467] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 871.452759] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager.update_available_resource {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 871.462453] 
env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 871.462688] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 871.462857] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 871.463025] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=65918) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 871.464096] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93f62384-988d-43bf-aeee-4e51379504fe {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 871.472883] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-610d0c7a-a9d1-463d-bc79-b8dbcdeb443f {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 871.487017] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2e5a2f8-bfac-4f9d-82fb-1e7876cf8061 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 871.493445] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3a6b9a3b-50cd-4323-a09f-1fb23f7137f1 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 871.523607] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181073MB free_disk=168GB free_vcpus=48 pci_devices=None {{(pid=65918) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 871.523773] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 871.523974] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 871.587898] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 
5c83d7da-f63b-40b7-a1aa-916ba9343439 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 871.588074] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance bd0158bd-e255-4680-b00e-81eb1ce88ad5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 871.588203] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 51163f89-c8b6-48a8-bbbe-de63c44d92a5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 871.588330] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 871.588446] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 934b5745-6c9a-4d21-92b8-7505a170e600 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 871.588564] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance bba6f3d9-1be3-4048-86d5-f435511b0fc0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 871.588681] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 3b3f8c10-5ba5-445c-a51d-5404874df3d9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 871.588795] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 0ccebca0-a1a4-48b2-9154-1c73350dab38 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 871.588907] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 871.589028] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance c04e5253-0275-4fb3-8eca-6a395c95930f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 871.599854] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 190285fc-ed83-417a-90db-b3c94feb4ce3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 871.609479] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 12059024-91af-400d-be6b-36fe9482b22b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 871.618342] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 78576ca1-7755-4532-82ee-de46c9d3a1fc has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 871.627365] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance bac2eb3c-464c-4859-afa6-d7a16ec452a0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 871.635943] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance a95f80f7-67e0-4f35-a3fb-5ecd02c783ba has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 871.644349] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 89dd139c-4533-4d48-aefa-750086205ad1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 871.652814] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 46e1dfe1-df73-430c-85ef-f5753974eed0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 871.661192] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance c07fe815-199c-41e8-b102-d2ffae0bb12c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 871.669842] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 6af69c4f-4822-4170-94a0-cdc587c825f7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 871.678271] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance efe8bbf3-f76d-4509-85d1-ffff559358b5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 871.686254] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 1e1d75b1-b4c3-4c72-a8e9-4e2f1b5103d4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 871.694921] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance b94157e3-2da8-4709-b1cf-b2bb14e0a6f3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 871.703347] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 529e8acc-2775-4eac-8e99-7e901a08f1d4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 871.712842] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance f26e4561-b450-4582-a415-a90a4dda7837 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 871.722477] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance c9932955-3b82-4c30-9441-b33695340ed2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 871.722780] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=65918) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 871.723034] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=65918) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 872.007171] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3222ed46-e873-4b52-8e29-8b01e2e1f465 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 872.014532] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-698defaa-741d-40ca-8fb6-50e3761cebb3 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 872.044208] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ddafc44b-61b5-4a9b-9ccd-bd428c589fda {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 872.050795] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a2d7402-1900-4f05-8e07-d3785655456b {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 872.063797] env[65918]: DEBUG nova.compute.provider_tree [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Inventory has not changed in ProviderTree for provider: 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 {{(pid=65918) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 872.071830] env[65918]: DEBUG nova.scheduler.client.report [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Inventory has not changed for provider 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 
'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65918) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 872.084562] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=65918) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 872.084745] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.561s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 873.055393] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 873.055696] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Starting heal instance info cache {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 873.056658] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Rebuilding the list of instances to heal {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 873.074980] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 873.075164] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 873.075301] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 873.075430] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 873.075586] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 873.075714] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Skipping network cache update for instance because it is Building. 
{{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 873.075834] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 873.075955] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 873.076088] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 873.077045] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 873.077045] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Didn't find any instances for network info cache update. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 873.077045] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 873.077045] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 873.077246] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=65918) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 873.424115] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 874.423392] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 876.424537] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 876.424842] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 897.950529] env[65918]: DEBUG nova.compute.manager [req-95d16dfc-7bf8-45ad-9d2b-0d3b47b7b99d req-2ecf789f-cf34-418d-aa99-b3c79c948246 service nova] [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Received event network-vif-deleted-e22ef964-3212-4cc4-b343-cd2af9c8a17d {{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 902.232761] env[65918]: DEBUG nova.compute.manager [req-a558cfc1-fbb3-4f8a-af45-3c91432b7d7e req-0177a792-944d-4474-ad9d-f50d004b65e8 service nova] [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] Received event network-vif-deleted-fa0df58d-7dc1-4739-857b-a6dacaf24577 {{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 904.426729] env[65918]: DEBUG nova.compute.manager [req-befc43df-67bd-4cc6-b1b2-c706e3005415 req-9b283c3a-d83d-44a8-9621-5a5f4b74925e service nova] [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] Received event network-vif-deleted-d3eb9665-bebf-45d4-8ebf-32c0202dca66 {{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 904.427031] env[65918]: DEBUG nova.compute.manager [req-befc43df-67bd-4cc6-b1b2-c706e3005415 req-9b283c3a-d83d-44a8-9621-5a5f4b74925e service nova] [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] Received event network-vif-deleted-4dae9f4f-c480-4bc3-9495-a21cb248ee3d {{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 904.825318] env[65918]: WARNING oslo_vmware.rw_handles [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 904.825318] env[65918]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 904.825318] env[65918]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 904.825318] env[65918]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 904.825318] env[65918]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 904.825318] env[65918]: ERROR oslo_vmware.rw_handles response.begin() [ 904.825318] env[65918]: ERROR oslo_vmware.rw_handles 
File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 904.825318] env[65918]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 904.825318] env[65918]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 904.825318] env[65918]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 904.825318] env[65918]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 904.825318] env[65918]: ERROR oslo_vmware.rw_handles [ 904.825729] env[65918]: DEBUG nova.virt.vmwareapi.images [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Downloaded image file data e017c336-3a02-4b58-874a-44a1d1e154fd to vmware_temp/ebc704ec-5f2c-45a7-bbc7-f16358999990/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk on the data store datastore1 {{(pid=65918) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 904.828166] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Caching image {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 904.828418] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Copying Virtual Disk [datastore1] vmware_temp/ebc704ec-5f2c-45a7-bbc7-f16358999990/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk to [datastore1] vmware_temp/ebc704ec-5f2c-45a7-bbc7-f16358999990/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk {{(pid=65918) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 904.828713] env[65918]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-16bcce4e-e6e0-4774-b2df-01f1b7caf3a5 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 904.838571] env[65918]: DEBUG oslo_vmware.api [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Waiting for the task: (returnval){ [ 904.838571] env[65918]: value = "task-2848201" [ 904.838571] env[65918]: _type = "Task" [ 904.838571] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 904.848449] env[65918]: DEBUG oslo_vmware.api [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Task: {'id': task-2848201, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 905.351614] env[65918]: DEBUG oslo_vmware.exceptions [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Fault InvalidArgument not matched. 
{{(pid=65918) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 905.353171] env[65918]: DEBUG oslo_concurrency.lockutils [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 905.354169] env[65918]: ERROR nova.compute.manager [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 905.354169] env[65918]: Faults: ['InvalidArgument'] [ 905.354169] env[65918]: ERROR nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Traceback (most recent call last): [ 905.354169] env[65918]: ERROR nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 905.354169] env[65918]: ERROR nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] yield resources [ 905.354169] env[65918]: ERROR nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 905.354169] env[65918]: ERROR nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] self.driver.spawn(context, instance, image_meta, [ 905.354169] env[65918]: ERROR nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 905.354169] env[65918]: ERROR nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] self._vmops.spawn(context, instance, image_meta, injected_files, [ 905.354169] env[65918]: ERROR nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 905.354169] env[65918]: ERROR nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] self._fetch_image_if_missing(context, vi) [ 905.354169] env[65918]: ERROR nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 905.354579] env[65918]: ERROR nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] image_cache(vi, tmp_image_ds_loc) [ 905.354579] env[65918]: ERROR nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 905.354579] env[65918]: ERROR nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] vm_util.copy_virtual_disk( [ 905.354579] env[65918]: ERROR nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 905.354579] env[65918]: ERROR nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] session._wait_for_task(vmdk_copy_task) [ 905.354579] env[65918]: ERROR nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 905.354579] 
env[65918]: ERROR nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] return self.wait_for_task(task_ref) [ 905.354579] env[65918]: ERROR nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 905.354579] env[65918]: ERROR nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] return evt.wait() [ 905.354579] env[65918]: ERROR nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 905.354579] env[65918]: ERROR nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] result = hub.switch() [ 905.354579] env[65918]: ERROR nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 905.354579] env[65918]: ERROR nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] return self.greenlet.switch() [ 905.354950] env[65918]: ERROR nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 905.354950] env[65918]: ERROR nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] self.f(*self.args, **self.kw) [ 905.354950] env[65918]: ERROR nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 905.354950] env[65918]: ERROR nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] raise exceptions.translate_fault(task_info.error) [ 905.354950] env[65918]: ERROR nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 905.354950] env[65918]: ERROR nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Faults: ['InvalidArgument'] [ 905.354950] env[65918]: ERROR nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] [ 905.355579] env[65918]: INFO nova.compute.manager [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Terminating instance [ 905.360533] env[65918]: DEBUG oslo_concurrency.lockutils [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Acquiring lock "refresh_cache-934b5745-6c9a-4d21-92b8-7505a170e600" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 905.360704] env[65918]: DEBUG oslo_concurrency.lockutils [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Acquired lock "refresh_cache-934b5745-6c9a-4d21-92b8-7505a170e600" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 905.360897] env[65918]: DEBUG nova.network.neutron [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Building network info cache for instance {{(pid=65918) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 905.361930] 
env[65918]: DEBUG oslo_concurrency.lockutils [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 905.362148] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 905.362388] env[65918]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3d905f70-2174-4738-9de2-67296b493942 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 905.373516] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 905.373698] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=65918) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 905.377474] env[65918]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-269480f9-eaa7-44cd-9755-c7368a43766a {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 905.384009] env[65918]: DEBUG oslo_vmware.api [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Waiting for the task: (returnval){ [ 905.384009] env[65918]: value = "session[523f5ec2-7667-1e2c-bfa3-3051980fd847]52491416-4cd3-d89a-b15b-470fea4a1015" [ 905.384009] env[65918]: _type = "Task" [ 905.384009] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 905.393282] env[65918]: DEBUG oslo_vmware.api [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Task: {'id': session[523f5ec2-7667-1e2c-bfa3-3051980fd847]52491416-4cd3-d89a-b15b-470fea4a1015, 'name': SearchDatastore_Task} progress is 0%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 905.403094] env[65918]: DEBUG nova.network.neutron [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Instance cache missing network info. 
{{(pid=65918) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 905.533076] env[65918]: DEBUG nova.network.neutron [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Updating instance_info_cache with network_info: [] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 905.542821] env[65918]: DEBUG oslo_concurrency.lockutils [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Releasing lock "refresh_cache-934b5745-6c9a-4d21-92b8-7505a170e600" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 905.543363] env[65918]: DEBUG nova.compute.manager [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Start destroying the instance on the hypervisor. {{(pid=65918) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 905.543606] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Destroying instance {{(pid=65918) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 905.545089] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c72693e8-d83c-4eba-bf84-11152b6cd2c9 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 905.556478] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Unregistering the VM {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 905.556478] env[65918]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-dd88e958-eb05-4d43-be8f-1dac1f23cda0 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 905.594920] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Unregistered the VM {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 905.594920] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Deleting contents of the VM from datastore datastore1 {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 905.594920] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Deleting the datastore file [datastore1] 934b5745-6c9a-4d21-92b8-7505a170e600 {{(pid=65918) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 905.594920] env[65918]: DEBUG 
oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-23e3d4fd-eb31-4dce-b556-57d193ab33dc {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 905.604845] env[65918]: DEBUG oslo_vmware.api [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Waiting for the task: (returnval){ [ 905.604845] env[65918]: value = "task-2848203" [ 905.604845] env[65918]: _type = "Task" [ 905.604845] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 905.610992] env[65918]: DEBUG oslo_vmware.api [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Task: {'id': task-2848203, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 905.895102] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Preparing fetch location {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 905.895407] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Creating directory with path [datastore1] vmware_temp/2255a211-eba1-4be7-a5a0-a045525f4746/e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 905.895666] env[65918]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-52a11e69-21eb-4e18-a348-6cd1683ea4e5 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 905.907775] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Created directory with path [datastore1] vmware_temp/2255a211-eba1-4be7-a5a0-a045525f4746/e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 905.907775] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Fetch image to [datastore1] vmware_temp/2255a211-eba1-4be7-a5a0-a045525f4746/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 905.907925] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Downloading image file data e017c336-3a02-4b58-874a-44a1d1e154fd to [datastore1] vmware_temp/2255a211-eba1-4be7-a5a0-a045525f4746/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk on the data store datastore1 {{(pid=65918) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 905.910255] env[65918]: 
DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f3cca19-76d5-451d-99e5-58988a1dd0d9 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 905.915844] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e2a79a9-d12b-48ce-8aa8-673c5bc740c3 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 905.926750] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9211b371-6c66-45f6-a96c-dcd498737ef4 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 905.960342] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a8a4bcf2-3292-471a-8a92-b161482bd2bd {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 905.969082] env[65918]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-7a1932fb-a8af-419f-8015-da99936f1206 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 905.990335] env[65918]: DEBUG nova.virt.vmwareapi.images [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Downloading image file data e017c336-3a02-4b58-874a-44a1d1e154fd to the data store datastore1 {{(pid=65918) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 906.047990] env[65918]: DEBUG oslo_vmware.rw_handles [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2255a211-eba1-4be7-a5a0-a045525f4746/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=65918) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 906.109577] env[65918]: DEBUG oslo_vmware.rw_handles [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Completed reading data from the image iterator. {{(pid=65918) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 906.109748] env[65918]: DEBUG oslo_vmware.rw_handles [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2255a211-eba1-4be7-a5a0-a045525f4746/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=65918) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 906.113715] env[65918]: DEBUG oslo_vmware.api [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Task: {'id': task-2848203, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.034514} completed successfully. 
{{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 906.117030] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Deleted the datastore file {{(pid=65918) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 906.117030] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Deleted contents of the VM from datastore datastore1 {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 906.117030] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Instance destroyed {{(pid=65918) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 906.117030] env[65918]: INFO nova.compute.manager [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Took 0.57 seconds to destroy the instance on the hypervisor. [ 906.117030] env[65918]: DEBUG oslo.service.loopingcall [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=65918) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 906.117370] env[65918]: DEBUG nova.compute.manager [-] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Skipping network deallocation for instance since networking was not requested. 
{{(pid=65918) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 906.117975] env[65918]: DEBUG nova.compute.claims [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Aborting claim: {{(pid=65918) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 906.117975] env[65918]: DEBUG oslo_concurrency.lockutils [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 906.117975] env[65918]: DEBUG oslo_concurrency.lockutils [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 906.266621] env[65918]: DEBUG nova.scheduler.client.report [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Refreshing inventories for resource provider 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 {{(pid=65918) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 906.282144] env[65918]: DEBUG nova.scheduler.client.report [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Updating ProviderTree inventory for provider 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65918) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 906.282385] env[65918]: DEBUG nova.compute.provider_tree [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Updating inventory in ProviderTree for provider 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65918) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 906.293916] env[65918]: DEBUG nova.scheduler.client.report [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Refreshing aggregate associations for resource provider 0bcf3fd3-93ee-4c0a-abed-95169e714cc4, aggregates: None {{(pid=65918) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 906.317696] env[65918]: DEBUG 
nova.scheduler.client.report [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Refreshing trait associations for resource provider 0bcf3fd3-93ee-4c0a-abed-95169e714cc4, traits: COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NODE {{(pid=65918) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 906.621229] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-47e1eacd-d424-4206-9949-cabd41e26432 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 906.630151] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db98fa16-a1fa-4d57-ac11-d0911cf8d83b {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 906.665509] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-671efdd2-c57a-49bb-8864-5dfdd7cd5a4d {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 906.674276] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66077dfc-14e4-4b7e-92fd-d089b6b93331 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 906.692284] env[65918]: DEBUG nova.compute.provider_tree [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Inventory has not changed in ProviderTree for provider: 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 {{(pid=65918) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 906.702525] env[65918]: DEBUG nova.scheduler.client.report [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Inventory has not changed for provider 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65918) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 906.716577] env[65918]: DEBUG oslo_concurrency.lockutils [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.598s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 906.716913] env[65918]: ERROR nova.compute.manager [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 906.716913] env[65918]: Faults: ['InvalidArgument'] [ 906.716913] env[65918]: ERROR nova.compute.manager [instance: 
934b5745-6c9a-4d21-92b8-7505a170e600] Traceback (most recent call last): [ 906.716913] env[65918]: ERROR nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 906.716913] env[65918]: ERROR nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] self.driver.spawn(context, instance, image_meta, [ 906.716913] env[65918]: ERROR nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 906.716913] env[65918]: ERROR nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] self._vmops.spawn(context, instance, image_meta, injected_files, [ 906.716913] env[65918]: ERROR nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 906.716913] env[65918]: ERROR nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] self._fetch_image_if_missing(context, vi) [ 906.716913] env[65918]: ERROR nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 906.716913] env[65918]: ERROR nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] image_cache(vi, tmp_image_ds_loc) [ 906.716913] env[65918]: ERROR nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 906.717312] env[65918]: ERROR nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] vm_util.copy_virtual_disk( [ 906.717312] env[65918]: ERROR nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 906.717312] env[65918]: ERROR nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] session._wait_for_task(vmdk_copy_task) [ 906.717312] env[65918]: ERROR nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 906.717312] env[65918]: ERROR nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] return self.wait_for_task(task_ref) [ 906.717312] env[65918]: ERROR nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 906.717312] env[65918]: ERROR nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] return evt.wait() [ 906.717312] env[65918]: ERROR nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 906.717312] env[65918]: ERROR nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] result = hub.switch() [ 906.717312] env[65918]: ERROR nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 906.717312] env[65918]: ERROR nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] return self.greenlet.switch() [ 906.717312] env[65918]: ERROR nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 906.717312] env[65918]: ERROR 
nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] self.f(*self.args, **self.kw) [ 906.717707] env[65918]: ERROR nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 906.717707] env[65918]: ERROR nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] raise exceptions.translate_fault(task_info.error) [ 906.717707] env[65918]: ERROR nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 906.717707] env[65918]: ERROR nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Faults: ['InvalidArgument'] [ 906.717707] env[65918]: ERROR nova.compute.manager [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] [ 906.717707] env[65918]: DEBUG nova.compute.utils [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] VimFaultException {{(pid=65918) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 906.719499] env[65918]: DEBUG nova.compute.manager [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Build of instance 934b5745-6c9a-4d21-92b8-7505a170e600 was re-scheduled: A specified parameter was not correct: fileType [ 906.719499] env[65918]: Faults: ['InvalidArgument'] {{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 906.719814] env[65918]: DEBUG nova.compute.manager [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Unplugging VIFs for instance {{(pid=65918) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 906.720060] env[65918]: DEBUG oslo_concurrency.lockutils [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Acquiring lock "refresh_cache-934b5745-6c9a-4d21-92b8-7505a170e600" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 906.720150] env[65918]: DEBUG oslo_concurrency.lockutils [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Acquired lock "refresh_cache-934b5745-6c9a-4d21-92b8-7505a170e600" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 906.720310] env[65918]: DEBUG nova.network.neutron [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Building network info cache for instance {{(pid=65918) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 906.755066] env[65918]: DEBUG nova.network.neutron [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Instance cache missing network info. 
{{(pid=65918) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 906.885460] env[65918]: DEBUG nova.network.neutron [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Updating instance_info_cache with network_info: [] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 906.894231] env[65918]: DEBUG oslo_concurrency.lockutils [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Releasing lock "refresh_cache-934b5745-6c9a-4d21-92b8-7505a170e600" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 906.894321] env[65918]: DEBUG nova.compute.manager [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=65918) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 906.894457] env[65918]: DEBUG nova.compute.manager [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Skipping network deallocation for instance since networking was not requested. {{(pid=65918) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 906.993452] env[65918]: INFO nova.scheduler.client.report [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Deleted allocations for instance 934b5745-6c9a-4d21-92b8-7505a170e600 [ 907.017376] env[65918]: DEBUG oslo_concurrency.lockutils [None req-dae53027-4723-46bf-a79e-3af5c5fe9f67 tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Lock "934b5745-6c9a-4d21-92b8-7505a170e600" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 275.290s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 907.017456] env[65918]: DEBUG oslo_concurrency.lockutils [None req-db2dabec-fe32-42c0-8e31-6056905342bc tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Lock "934b5745-6c9a-4d21-92b8-7505a170e600" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 73.783s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 907.017631] env[65918]: DEBUG oslo_concurrency.lockutils [None req-db2dabec-fe32-42c0-8e31-6056905342bc tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Acquiring lock "934b5745-6c9a-4d21-92b8-7505a170e600-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 907.017917] env[65918]: DEBUG oslo_concurrency.lockutils [None req-db2dabec-fe32-42c0-8e31-6056905342bc tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Lock "934b5745-6c9a-4d21-92b8-7505a170e600-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 907.018128] env[65918]: DEBUG oslo_concurrency.lockutils [None req-db2dabec-fe32-42c0-8e31-6056905342bc tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Lock "934b5745-6c9a-4d21-92b8-7505a170e600-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 907.020529] env[65918]: INFO nova.compute.manager [None req-db2dabec-fe32-42c0-8e31-6056905342bc tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Terminating instance [ 907.023333] env[65918]: DEBUG oslo_concurrency.lockutils [None req-db2dabec-fe32-42c0-8e31-6056905342bc tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Acquiring lock "refresh_cache-934b5745-6c9a-4d21-92b8-7505a170e600" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 907.023552] env[65918]: DEBUG oslo_concurrency.lockutils [None req-db2dabec-fe32-42c0-8e31-6056905342bc tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Acquired lock "refresh_cache-934b5745-6c9a-4d21-92b8-7505a170e600" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 907.023969] env[65918]: DEBUG nova.network.neutron [None req-db2dabec-fe32-42c0-8e31-6056905342bc tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Building network info cache for instance {{(pid=65918) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 907.031834] env[65918]: DEBUG nova.compute.manager [None req-41df5eb8-b4fa-4d6e-af83-77fa3c068667 tempest-ServersNegativeTestMultiTenantJSON-1030047277 tempest-ServersNegativeTestMultiTenantJSON-1030047277-project-member] [instance: 190285fc-ed83-417a-90db-b3c94feb4ce3] Starting instance... {{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 907.058503] env[65918]: DEBUG nova.network.neutron [None req-db2dabec-fe32-42c0-8e31-6056905342bc tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Instance cache missing network info. {{(pid=65918) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 907.067263] env[65918]: DEBUG nova.compute.manager [None req-41df5eb8-b4fa-4d6e-af83-77fa3c068667 tempest-ServersNegativeTestMultiTenantJSON-1030047277 tempest-ServersNegativeTestMultiTenantJSON-1030047277-project-member] [instance: 190285fc-ed83-417a-90db-b3c94feb4ce3] Instance disappeared before build. 
{{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 907.093255] env[65918]: DEBUG oslo_concurrency.lockutils [None req-41df5eb8-b4fa-4d6e-af83-77fa3c068667 tempest-ServersNegativeTestMultiTenantJSON-1030047277 tempest-ServersNegativeTestMultiTenantJSON-1030047277-project-member] Lock "190285fc-ed83-417a-90db-b3c94feb4ce3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 199.307s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 907.106261] env[65918]: DEBUG nova.compute.manager [None req-2de2001b-6e31-4a3e-ba47-25f1155cd840 tempest-ServersTestMultiNic-1319304383 tempest-ServersTestMultiNic-1319304383-project-member] [instance: 12059024-91af-400d-be6b-36fe9482b22b] Starting instance... {{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 907.135209] env[65918]: DEBUG nova.compute.manager [None req-2de2001b-6e31-4a3e-ba47-25f1155cd840 tempest-ServersTestMultiNic-1319304383 tempest-ServersTestMultiNic-1319304383-project-member] [instance: 12059024-91af-400d-be6b-36fe9482b22b] Instance disappeared before build. {{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 907.135209] env[65918]: DEBUG nova.network.neutron [None req-db2dabec-fe32-42c0-8e31-6056905342bc tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Updating instance_info_cache with network_info: [] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 907.145020] env[65918]: DEBUG oslo_concurrency.lockutils [None req-db2dabec-fe32-42c0-8e31-6056905342bc tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Releasing lock "refresh_cache-934b5745-6c9a-4d21-92b8-7505a170e600" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 907.145020] env[65918]: DEBUG nova.compute.manager [None req-db2dabec-fe32-42c0-8e31-6056905342bc tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Start destroying the instance on the hypervisor. 
{{(pid=65918) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 907.145020] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-db2dabec-fe32-42c0-8e31-6056905342bc tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Destroying instance {{(pid=65918) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 907.145020] env[65918]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-65885f5c-299f-4de3-8755-250a36e75962 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 907.155862] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-97753bf6-0a0f-49fb-98ff-28632a45b9e2 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 907.169972] env[65918]: DEBUG oslo_concurrency.lockutils [None req-2de2001b-6e31-4a3e-ba47-25f1155cd840 tempest-ServersTestMultiNic-1319304383 tempest-ServersTestMultiNic-1319304383-project-member] Lock "12059024-91af-400d-be6b-36fe9482b22b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 198.959s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 907.183015] env[65918]: DEBUG nova.compute.manager [None req-a6ad83fa-20b6-48f3-bd50-e393db572a46 tempest-ServerTagsTestJSON-733766138 tempest-ServerTagsTestJSON-733766138-project-member] [instance: 78576ca1-7755-4532-82ee-de46c9d3a1fc] Starting instance... {{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 907.192557] env[65918]: WARNING nova.virt.vmwareapi.vmops [None req-db2dabec-fe32-42c0-8e31-6056905342bc tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 934b5745-6c9a-4d21-92b8-7505a170e600 could not be found. [ 907.192944] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-db2dabec-fe32-42c0-8e31-6056905342bc tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Instance destroyed {{(pid=65918) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 907.193273] env[65918]: INFO nova.compute.manager [None req-db2dabec-fe32-42c0-8e31-6056905342bc tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Took 0.05 seconds to destroy the instance on the hypervisor. [ 907.193785] env[65918]: DEBUG oslo.service.loopingcall [None req-db2dabec-fe32-42c0-8e31-6056905342bc tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=65918) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 907.194387] env[65918]: DEBUG nova.compute.manager [-] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Deallocating network for instance {{(pid=65918) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 907.194634] env[65918]: DEBUG nova.network.neutron [-] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] deallocate_for_instance() {{(pid=65918) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 907.212600] env[65918]: DEBUG nova.network.neutron [-] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Instance cache missing network info. {{(pid=65918) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 907.223028] env[65918]: DEBUG nova.network.neutron [-] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Updating instance_info_cache with network_info: [] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 907.244268] env[65918]: INFO nova.compute.manager [-] [instance: 934b5745-6c9a-4d21-92b8-7505a170e600] Took 0.05 seconds to deallocate network for instance. [ 907.252925] env[65918]: DEBUG oslo_concurrency.lockutils [None req-a6ad83fa-20b6-48f3-bd50-e393db572a46 tempest-ServerTagsTestJSON-733766138 tempest-ServerTagsTestJSON-733766138-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 907.253193] env[65918]: DEBUG oslo_concurrency.lockutils [None req-a6ad83fa-20b6-48f3-bd50-e393db572a46 tempest-ServerTagsTestJSON-733766138 tempest-ServerTagsTestJSON-733766138-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 907.255063] env[65918]: INFO nova.compute.claims [None req-a6ad83fa-20b6-48f3-bd50-e393db572a46 tempest-ServerTagsTestJSON-733766138 tempest-ServerTagsTestJSON-733766138-project-member] [instance: 78576ca1-7755-4532-82ee-de46c9d3a1fc] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 907.393761] env[65918]: DEBUG oslo_concurrency.lockutils [None req-db2dabec-fe32-42c0-8e31-6056905342bc tempest-ServersAdmin275Test-1531822256 tempest-ServersAdmin275Test-1531822256-project-member] Lock "934b5745-6c9a-4d21-92b8-7505a170e600" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.376s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 907.592018] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-391d2102-8da6-4849-bc39-d6fa1ae672a1 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 907.601394] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-edadc2e4-e5e4-4191-ac20-4322a86a6150 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 907.639266] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1953a0a-7117-460c-a2cf-2fae972b46db {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 907.646869] env[65918]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84ce715b-14a2-4ce0-b763-10fc2b30c935 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 907.660583] env[65918]: DEBUG nova.compute.provider_tree [None req-a6ad83fa-20b6-48f3-bd50-e393db572a46 tempest-ServerTagsTestJSON-733766138 tempest-ServerTagsTestJSON-733766138-project-member] Inventory has not changed in ProviderTree for provider: 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 {{(pid=65918) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 907.669929] env[65918]: DEBUG nova.scheduler.client.report [None req-a6ad83fa-20b6-48f3-bd50-e393db572a46 tempest-ServerTagsTestJSON-733766138 tempest-ServerTagsTestJSON-733766138-project-member] Inventory has not changed for provider 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65918) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 907.687104] env[65918]: DEBUG oslo_concurrency.lockutils [None req-a6ad83fa-20b6-48f3-bd50-e393db572a46 tempest-ServerTagsTestJSON-733766138 tempest-ServerTagsTestJSON-733766138-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.434s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 907.687689] env[65918]: DEBUG nova.compute.manager [None req-a6ad83fa-20b6-48f3-bd50-e393db572a46 tempest-ServerTagsTestJSON-733766138 tempest-ServerTagsTestJSON-733766138-project-member] [instance: 78576ca1-7755-4532-82ee-de46c9d3a1fc] Start building networks asynchronously for instance. 
{{(pid=65918) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 907.713362] env[65918]: DEBUG nova.compute.claims [None req-a6ad83fa-20b6-48f3-bd50-e393db572a46 tempest-ServerTagsTestJSON-733766138 tempest-ServerTagsTestJSON-733766138-project-member] [instance: 78576ca1-7755-4532-82ee-de46c9d3a1fc] Aborting claim: {{(pid=65918) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 907.713619] env[65918]: DEBUG oslo_concurrency.lockutils [None req-a6ad83fa-20b6-48f3-bd50-e393db572a46 tempest-ServerTagsTestJSON-733766138 tempest-ServerTagsTestJSON-733766138-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 907.713957] env[65918]: DEBUG oslo_concurrency.lockutils [None req-a6ad83fa-20b6-48f3-bd50-e393db572a46 tempest-ServerTagsTestJSON-733766138 tempest-ServerTagsTestJSON-733766138-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 907.743305] env[65918]: DEBUG oslo_concurrency.lockutils [None req-a6ad83fa-20b6-48f3-bd50-e393db572a46 tempest-ServerTagsTestJSON-733766138 tempest-ServerTagsTestJSON-733766138-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.029s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 907.744248] env[65918]: DEBUG nova.compute.utils [None req-a6ad83fa-20b6-48f3-bd50-e393db572a46 tempest-ServerTagsTestJSON-733766138 tempest-ServerTagsTestJSON-733766138-project-member] [instance: 78576ca1-7755-4532-82ee-de46c9d3a1fc] Instance 78576ca1-7755-4532-82ee-de46c9d3a1fc could not be found. {{(pid=65918) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 907.745653] env[65918]: DEBUG nova.compute.manager [None req-a6ad83fa-20b6-48f3-bd50-e393db572a46 tempest-ServerTagsTestJSON-733766138 tempest-ServerTagsTestJSON-733766138-project-member] [instance: 78576ca1-7755-4532-82ee-de46c9d3a1fc] Instance disappeared during build. 
{{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 907.745815] env[65918]: DEBUG nova.compute.manager [None req-a6ad83fa-20b6-48f3-bd50-e393db572a46 tempest-ServerTagsTestJSON-733766138 tempest-ServerTagsTestJSON-733766138-project-member] [instance: 78576ca1-7755-4532-82ee-de46c9d3a1fc] Unplugging VIFs for instance {{(pid=65918) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 907.746031] env[65918]: DEBUG oslo_concurrency.lockutils [None req-a6ad83fa-20b6-48f3-bd50-e393db572a46 tempest-ServerTagsTestJSON-733766138 tempest-ServerTagsTestJSON-733766138-project-member] Acquiring lock "refresh_cache-78576ca1-7755-4532-82ee-de46c9d3a1fc" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 907.746180] env[65918]: DEBUG oslo_concurrency.lockutils [None req-a6ad83fa-20b6-48f3-bd50-e393db572a46 tempest-ServerTagsTestJSON-733766138 tempest-ServerTagsTestJSON-733766138-project-member] Acquired lock "refresh_cache-78576ca1-7755-4532-82ee-de46c9d3a1fc" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 907.746336] env[65918]: DEBUG nova.network.neutron [None req-a6ad83fa-20b6-48f3-bd50-e393db572a46 tempest-ServerTagsTestJSON-733766138 tempest-ServerTagsTestJSON-733766138-project-member] [instance: 78576ca1-7755-4532-82ee-de46c9d3a1fc] Building network info cache for instance {{(pid=65918) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 907.759746] env[65918]: DEBUG nova.compute.utils [None req-a6ad83fa-20b6-48f3-bd50-e393db572a46 tempest-ServerTagsTestJSON-733766138 tempest-ServerTagsTestJSON-733766138-project-member] [instance: 78576ca1-7755-4532-82ee-de46c9d3a1fc] Can not refresh info_cache because instance was not found {{(pid=65918) refresh_info_cache_for_instance /opt/stack/nova/nova/compute/utils.py:1010}} [ 907.794053] env[65918]: DEBUG nova.network.neutron [None req-a6ad83fa-20b6-48f3-bd50-e393db572a46 tempest-ServerTagsTestJSON-733766138 tempest-ServerTagsTestJSON-733766138-project-member] [instance: 78576ca1-7755-4532-82ee-de46c9d3a1fc] Instance cache missing network info. {{(pid=65918) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 908.368979] env[65918]: DEBUG nova.network.neutron [None req-a6ad83fa-20b6-48f3-bd50-e393db572a46 tempest-ServerTagsTestJSON-733766138 tempest-ServerTagsTestJSON-733766138-project-member] [instance: 78576ca1-7755-4532-82ee-de46c9d3a1fc] Updating instance_info_cache with network_info: [] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 908.380186] env[65918]: DEBUG oslo_concurrency.lockutils [None req-a6ad83fa-20b6-48f3-bd50-e393db572a46 tempest-ServerTagsTestJSON-733766138 tempest-ServerTagsTestJSON-733766138-project-member] Releasing lock "refresh_cache-78576ca1-7755-4532-82ee-de46c9d3a1fc" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 908.380186] env[65918]: DEBUG nova.compute.manager [None req-a6ad83fa-20b6-48f3-bd50-e393db572a46 tempest-ServerTagsTestJSON-733766138 tempest-ServerTagsTestJSON-733766138-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=65918) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 908.380186] env[65918]: DEBUG nova.compute.manager [None req-a6ad83fa-20b6-48f3-bd50-e393db572a46 tempest-ServerTagsTestJSON-733766138 tempest-ServerTagsTestJSON-733766138-project-member] [instance: 78576ca1-7755-4532-82ee-de46c9d3a1fc] Deallocating network for instance {{(pid=65918) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 908.380186] env[65918]: DEBUG nova.network.neutron [None req-a6ad83fa-20b6-48f3-bd50-e393db572a46 tempest-ServerTagsTestJSON-733766138 tempest-ServerTagsTestJSON-733766138-project-member] [instance: 78576ca1-7755-4532-82ee-de46c9d3a1fc] deallocate_for_instance() {{(pid=65918) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 908.428168] env[65918]: DEBUG nova.network.neutron [None req-a6ad83fa-20b6-48f3-bd50-e393db572a46 tempest-ServerTagsTestJSON-733766138 tempest-ServerTagsTestJSON-733766138-project-member] [instance: 78576ca1-7755-4532-82ee-de46c9d3a1fc] Instance cache missing network info. {{(pid=65918) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 908.439875] env[65918]: DEBUG nova.network.neutron [None req-a6ad83fa-20b6-48f3-bd50-e393db572a46 tempest-ServerTagsTestJSON-733766138 tempest-ServerTagsTestJSON-733766138-project-member] [instance: 78576ca1-7755-4532-82ee-de46c9d3a1fc] Updating instance_info_cache with network_info: [] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 908.457671] env[65918]: INFO nova.compute.manager [None req-a6ad83fa-20b6-48f3-bd50-e393db572a46 tempest-ServerTagsTestJSON-733766138 tempest-ServerTagsTestJSON-733766138-project-member] [instance: 78576ca1-7755-4532-82ee-de46c9d3a1fc] Took 0.08 seconds to deallocate network for instance. [ 908.523265] env[65918]: DEBUG oslo_concurrency.lockutils [None req-a6ad83fa-20b6-48f3-bd50-e393db572a46 tempest-ServerTagsTestJSON-733766138 tempest-ServerTagsTestJSON-733766138-project-member] Lock "78576ca1-7755-4532-82ee-de46c9d3a1fc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 199.815s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 908.537308] env[65918]: DEBUG nova.compute.manager [None req-e9bad983-af7e-4d4a-a324-eca345a65263 tempest-ServerMetadataTestJSON-324132329 tempest-ServerMetadataTestJSON-324132329-project-member] [instance: bac2eb3c-464c-4859-afa6-d7a16ec452a0] Starting instance... {{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 908.568124] env[65918]: DEBUG nova.compute.manager [None req-e9bad983-af7e-4d4a-a324-eca345a65263 tempest-ServerMetadataTestJSON-324132329 tempest-ServerMetadataTestJSON-324132329-project-member] [instance: bac2eb3c-464c-4859-afa6-d7a16ec452a0] Instance disappeared before build. 
{{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 908.589887] env[65918]: DEBUG oslo_concurrency.lockutils [None req-e9bad983-af7e-4d4a-a324-eca345a65263 tempest-ServerMetadataTestJSON-324132329 tempest-ServerMetadataTestJSON-324132329-project-member] Lock "bac2eb3c-464c-4859-afa6-d7a16ec452a0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 198.928s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 908.616019] env[65918]: DEBUG nova.compute.manager [None req-2c4ae3ce-20f3-4a08-b1a6-ec9ef3e26f9e tempest-ServersTestJSON-1292267645 tempest-ServersTestJSON-1292267645-project-member] [instance: a95f80f7-67e0-4f35-a3fb-5ecd02c783ba] Starting instance... {{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 908.641745] env[65918]: DEBUG nova.compute.manager [None req-2c4ae3ce-20f3-4a08-b1a6-ec9ef3e26f9e tempest-ServersTestJSON-1292267645 tempest-ServersTestJSON-1292267645-project-member] [instance: a95f80f7-67e0-4f35-a3fb-5ecd02c783ba] Instance disappeared before build. {{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 908.673720] env[65918]: DEBUG oslo_concurrency.lockutils [None req-2c4ae3ce-20f3-4a08-b1a6-ec9ef3e26f9e tempest-ServersTestJSON-1292267645 tempest-ServersTestJSON-1292267645-project-member] Lock "a95f80f7-67e0-4f35-a3fb-5ecd02c783ba" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 197.706s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 908.687835] env[65918]: DEBUG nova.compute.manager [None req-99bc0aed-9d41-40af-8116-4fa30aedc390 tempest-AttachInterfacesTestJSON-833441806 tempest-AttachInterfacesTestJSON-833441806-project-member] [instance: 89dd139c-4533-4d48-aefa-750086205ad1] Starting instance... 
{{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 908.756036] env[65918]: DEBUG oslo_concurrency.lockutils [None req-99bc0aed-9d41-40af-8116-4fa30aedc390 tempest-AttachInterfacesTestJSON-833441806 tempest-AttachInterfacesTestJSON-833441806-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 908.756036] env[65918]: DEBUG oslo_concurrency.lockutils [None req-99bc0aed-9d41-40af-8116-4fa30aedc390 tempest-AttachInterfacesTestJSON-833441806 tempest-AttachInterfacesTestJSON-833441806-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 908.757487] env[65918]: INFO nova.compute.claims [None req-99bc0aed-9d41-40af-8116-4fa30aedc390 tempest-AttachInterfacesTestJSON-833441806 tempest-AttachInterfacesTestJSON-833441806-project-member] [instance: 89dd139c-4533-4d48-aefa-750086205ad1] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 909.100724] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f714a6c4-345a-422b-a60e-84614b04d2ff {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 909.107813] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-06ce42c9-b8b0-4db0-a626-20403b51bf96 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 909.142901] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-37428fa3-210f-4b1f-9c95-7cd046372446 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 909.151946] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-63e244d8-142d-4c8c-b0d2-11a18c3e7c29 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 909.165859] env[65918]: DEBUG nova.compute.provider_tree [None req-99bc0aed-9d41-40af-8116-4fa30aedc390 tempest-AttachInterfacesTestJSON-833441806 tempest-AttachInterfacesTestJSON-833441806-project-member] Inventory has not changed in ProviderTree for provider: 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 {{(pid=65918) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 909.176348] env[65918]: DEBUG nova.scheduler.client.report [None req-99bc0aed-9d41-40af-8116-4fa30aedc390 tempest-AttachInterfacesTestJSON-833441806 tempest-AttachInterfacesTestJSON-833441806-project-member] Inventory has not changed for provider 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65918) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 909.190620] env[65918]: DEBUG oslo_concurrency.lockutils [None req-99bc0aed-9d41-40af-8116-4fa30aedc390 
tempest-AttachInterfacesTestJSON-833441806 tempest-AttachInterfacesTestJSON-833441806-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.435s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 909.191168] env[65918]: DEBUG nova.compute.manager [None req-99bc0aed-9d41-40af-8116-4fa30aedc390 tempest-AttachInterfacesTestJSON-833441806 tempest-AttachInterfacesTestJSON-833441806-project-member] [instance: 89dd139c-4533-4d48-aefa-750086205ad1] Start building networks asynchronously for instance. {{(pid=65918) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 909.226984] env[65918]: DEBUG nova.compute.claims [None req-99bc0aed-9d41-40af-8116-4fa30aedc390 tempest-AttachInterfacesTestJSON-833441806 tempest-AttachInterfacesTestJSON-833441806-project-member] [instance: 89dd139c-4533-4d48-aefa-750086205ad1] Aborting claim: {{(pid=65918) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 909.226984] env[65918]: DEBUG oslo_concurrency.lockutils [None req-99bc0aed-9d41-40af-8116-4fa30aedc390 tempest-AttachInterfacesTestJSON-833441806 tempest-AttachInterfacesTestJSON-833441806-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 909.226984] env[65918]: DEBUG oslo_concurrency.lockutils [None req-99bc0aed-9d41-40af-8116-4fa30aedc390 tempest-AttachInterfacesTestJSON-833441806 tempest-AttachInterfacesTestJSON-833441806-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 909.480974] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3a12dca-cade-441f-8ee7-8dc9e1ca8ea6 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 909.493468] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-155fefe8-c3ab-4367-ba33-61dcd7d97b1d {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 909.528965] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b0e2444-9c55-43f6-90e8-d6e2df4a6c23 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 909.538119] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc5c5db6-5651-405a-a27b-429620f3f896 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 909.552017] env[65918]: DEBUG nova.compute.provider_tree [None req-99bc0aed-9d41-40af-8116-4fa30aedc390 tempest-AttachInterfacesTestJSON-833441806 tempest-AttachInterfacesTestJSON-833441806-project-member] Inventory has not changed in ProviderTree for provider: 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 {{(pid=65918) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 909.562097] env[65918]: DEBUG nova.scheduler.client.report [None req-99bc0aed-9d41-40af-8116-4fa30aedc390 tempest-AttachInterfacesTestJSON-833441806 tempest-AttachInterfacesTestJSON-833441806-project-member] Inventory has not changed for provider 
0bcf3fd3-93ee-4c0a-abed-95169e714cc4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65918) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 909.579712] env[65918]: DEBUG oslo_concurrency.lockutils [None req-99bc0aed-9d41-40af-8116-4fa30aedc390 tempest-AttachInterfacesTestJSON-833441806 tempest-AttachInterfacesTestJSON-833441806-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.353s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 909.580571] env[65918]: DEBUG nova.compute.utils [None req-99bc0aed-9d41-40af-8116-4fa30aedc390 tempest-AttachInterfacesTestJSON-833441806 tempest-AttachInterfacesTestJSON-833441806-project-member] [instance: 89dd139c-4533-4d48-aefa-750086205ad1] Conflict updating instance 89dd139c-4533-4d48-aefa-750086205ad1. Expected: {'task_state': [None]}. Actual: {'task_state': 'deleting'} {{(pid=65918) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 909.582013] env[65918]: DEBUG nova.compute.manager [None req-99bc0aed-9d41-40af-8116-4fa30aedc390 tempest-AttachInterfacesTestJSON-833441806 tempest-AttachInterfacesTestJSON-833441806-project-member] [instance: 89dd139c-4533-4d48-aefa-750086205ad1] Instance disappeared during build. {{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 909.582208] env[65918]: DEBUG nova.compute.manager [None req-99bc0aed-9d41-40af-8116-4fa30aedc390 tempest-AttachInterfacesTestJSON-833441806 tempest-AttachInterfacesTestJSON-833441806-project-member] [instance: 89dd139c-4533-4d48-aefa-750086205ad1] Unplugging VIFs for instance {{(pid=65918) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 909.582437] env[65918]: DEBUG oslo_concurrency.lockutils [None req-99bc0aed-9d41-40af-8116-4fa30aedc390 tempest-AttachInterfacesTestJSON-833441806 tempest-AttachInterfacesTestJSON-833441806-project-member] Acquiring lock "refresh_cache-89dd139c-4533-4d48-aefa-750086205ad1" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 909.582583] env[65918]: DEBUG oslo_concurrency.lockutils [None req-99bc0aed-9d41-40af-8116-4fa30aedc390 tempest-AttachInterfacesTestJSON-833441806 tempest-AttachInterfacesTestJSON-833441806-project-member] Acquired lock "refresh_cache-89dd139c-4533-4d48-aefa-750086205ad1" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 909.582739] env[65918]: DEBUG nova.network.neutron [None req-99bc0aed-9d41-40af-8116-4fa30aedc390 tempest-AttachInterfacesTestJSON-833441806 tempest-AttachInterfacesTestJSON-833441806-project-member] [instance: 89dd139c-4533-4d48-aefa-750086205ad1] Building network info cache for instance {{(pid=65918) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 909.590536] env[65918]: DEBUG nova.compute.utils [None req-99bc0aed-9d41-40af-8116-4fa30aedc390 tempest-AttachInterfacesTestJSON-833441806 tempest-AttachInterfacesTestJSON-833441806-project-member] [instance: 89dd139c-4533-4d48-aefa-750086205ad1] Can not refresh info_cache because instance 
was not found {{(pid=65918) refresh_info_cache_for_instance /opt/stack/nova/nova/compute/utils.py:1010}} [ 909.643153] env[65918]: DEBUG nova.network.neutron [None req-99bc0aed-9d41-40af-8116-4fa30aedc390 tempest-AttachInterfacesTestJSON-833441806 tempest-AttachInterfacesTestJSON-833441806-project-member] [instance: 89dd139c-4533-4d48-aefa-750086205ad1] Instance cache missing network info. {{(pid=65918) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 910.088583] env[65918]: DEBUG nova.network.neutron [None req-99bc0aed-9d41-40af-8116-4fa30aedc390 tempest-AttachInterfacesTestJSON-833441806 tempest-AttachInterfacesTestJSON-833441806-project-member] [instance: 89dd139c-4533-4d48-aefa-750086205ad1] Updating instance_info_cache with network_info: [] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 910.102853] env[65918]: DEBUG oslo_concurrency.lockutils [None req-99bc0aed-9d41-40af-8116-4fa30aedc390 tempest-AttachInterfacesTestJSON-833441806 tempest-AttachInterfacesTestJSON-833441806-project-member] Releasing lock "refresh_cache-89dd139c-4533-4d48-aefa-750086205ad1" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 910.102853] env[65918]: DEBUG nova.compute.manager [None req-99bc0aed-9d41-40af-8116-4fa30aedc390 tempest-AttachInterfacesTestJSON-833441806 tempest-AttachInterfacesTestJSON-833441806-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=65918) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 910.102853] env[65918]: DEBUG nova.compute.manager [None req-99bc0aed-9d41-40af-8116-4fa30aedc390 tempest-AttachInterfacesTestJSON-833441806 tempest-AttachInterfacesTestJSON-833441806-project-member] [instance: 89dd139c-4533-4d48-aefa-750086205ad1] Deallocating network for instance {{(pid=65918) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 910.102853] env[65918]: DEBUG nova.network.neutron [None req-99bc0aed-9d41-40af-8116-4fa30aedc390 tempest-AttachInterfacesTestJSON-833441806 tempest-AttachInterfacesTestJSON-833441806-project-member] [instance: 89dd139c-4533-4d48-aefa-750086205ad1] deallocate_for_instance() {{(pid=65918) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 910.369113] env[65918]: DEBUG nova.network.neutron [None req-99bc0aed-9d41-40af-8116-4fa30aedc390 tempest-AttachInterfacesTestJSON-833441806 tempest-AttachInterfacesTestJSON-833441806-project-member] [instance: 89dd139c-4533-4d48-aefa-750086205ad1] Instance cache missing network info. {{(pid=65918) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 910.378078] env[65918]: DEBUG nova.network.neutron [None req-99bc0aed-9d41-40af-8116-4fa30aedc390 tempest-AttachInterfacesTestJSON-833441806 tempest-AttachInterfacesTestJSON-833441806-project-member] [instance: 89dd139c-4533-4d48-aefa-750086205ad1] Updating instance_info_cache with network_info: [] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 910.387189] env[65918]: INFO nova.compute.manager [None req-99bc0aed-9d41-40af-8116-4fa30aedc390 tempest-AttachInterfacesTestJSON-833441806 tempest-AttachInterfacesTestJSON-833441806-project-member] [instance: 89dd139c-4533-4d48-aefa-750086205ad1] Took 0.28 seconds to deallocate network for instance. 
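The recurring "Acquiring lock "compute_resources" by ..." / "Lock "compute_resources" acquired by ... waited 0.000s" / "Lock "compute_resources" "released" by ... held N.NNNs" triplets around instance_claim and abort_instance_claim in the entries above are all emitted by oslo.concurrency's lockutils wrapper (the inner frames at lockutils.py:404/409/423 cited in each entry). Below is a minimal sketch of the pattern that produces them, assuming only the public lockutils.synchronized decorator; the function bodies are placeholders, not Nova's actual ResourceTracker code.

    # Sketch: the acquire/acquired/released log triplets above come from the
    # wrapper that lockutils.synchronized() puts around the decorated function.
    # The bodies here are stand-ins; only the locking pattern is the point.
    from oslo_concurrency import lockutils

    COMPUTE_RESOURCE_SEMAPHORE = "compute_resources"

    @lockutils.synchronized(COMPUTE_RESOURCE_SEMAPHORE)
    def instance_claim(claims, instance_uuid, resources):
        # Runs with the "compute_resources" lock held; the wrapper logs how long
        # the caller waited for the lock and, on exit, how long it was held.
        claims[instance_uuid] = resources
        return claims[instance_uuid]

    @lockutils.synchronized(COMPUTE_RESOURCE_SEMAPHORE)
    def abort_instance_claim(claims, instance_uuid):
        # Same lock name, so aborting a claim serializes against new claims,
        # which is why the log shows back-to-back acquired/released pairs with
        # near-zero wait times.
        claims.pop(instance_uuid, None)

Because both operations share one lock name, a claim that has to be rolled back (as with 78576ca1 and 89dd139c above, both of which disappeared during build) re-enters the same critical section, so an instance_claim release is immediately followed by an abort_instance_claim acquire in the entries above.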
[ 910.446330] env[65918]: DEBUG oslo_concurrency.lockutils [None req-99bc0aed-9d41-40af-8116-4fa30aedc390 tempest-AttachInterfacesTestJSON-833441806 tempest-AttachInterfacesTestJSON-833441806-project-member] Lock "89dd139c-4533-4d48-aefa-750086205ad1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 199.241s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 910.460483] env[65918]: DEBUG nova.compute.manager [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Starting instance... {{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 910.524837] env[65918]: DEBUG oslo_concurrency.lockutils [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 910.525143] env[65918]: DEBUG oslo_concurrency.lockutils [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 910.526663] env[65918]: INFO nova.compute.claims [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 910.981896] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d8d5436e-27b6-4ef2-ab60-d627ce5e248b {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 910.981896] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bafdd1bf-1140-4dd9-8646-4dae68c268ba {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 910.981896] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8d18886-028c-4cd4-b2cb-fd6c1af74f9f {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 910.981896] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de8cdf2b-a97e-44ec-9790-5e1488b9a4b8 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 910.981896] env[65918]: DEBUG nova.compute.provider_tree [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Inventory has not changed in ProviderTree for provider: 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 {{(pid=65918) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 910.988959] env[65918]: DEBUG nova.scheduler.client.report [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 
tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Inventory has not changed for provider 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65918) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 910.997931] env[65918]: DEBUG oslo_concurrency.lockutils [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.471s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 910.997931] env[65918]: DEBUG nova.compute.manager [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Start building networks asynchronously for instance. {{(pid=65918) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 911.097986] env[65918]: DEBUG nova.compute.utils [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Using /dev/sd instead of None {{(pid=65918) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 911.102404] env[65918]: DEBUG nova.compute.manager [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Allocating IP information in the background. {{(pid=65918) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 911.102575] env[65918]: DEBUG nova.network.neutron [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] allocate_for_instance() {{(pid=65918) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 911.111334] env[65918]: DEBUG nova.compute.manager [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Start building block device mappings for instance. {{(pid=65918) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 911.237200] env[65918]: DEBUG nova.compute.manager [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Start spawning the instance on the hypervisor. 
{{(pid=65918) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 911.261679] env[65918]: DEBUG nova.virt.hardware [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:52:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:52:33Z,direct_url=,disk_format='vmdk',id=e017c336-3a02-4b58-874a-44a1d1e154fd,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='a06639ef48fd43768e273db76d6c8f54',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:52:33Z,virtual_size=,visibility=), allow threads: False {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 911.261907] env[65918]: DEBUG nova.virt.hardware [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Flavor limits 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 911.262092] env[65918]: DEBUG nova.virt.hardware [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Image limits 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 911.262398] env[65918]: DEBUG nova.virt.hardware [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Flavor pref 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 911.262398] env[65918]: DEBUG nova.virt.hardware [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Image pref 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 911.262556] env[65918]: DEBUG nova.virt.hardware [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 911.263208] env[65918]: DEBUG nova.virt.hardware [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 911.263208] env[65918]: DEBUG nova.virt.hardware [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=65918) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 911.263208] env[65918]: DEBUG 
nova.virt.hardware [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Got 1 possible topologies {{(pid=65918) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 911.263208] env[65918]: DEBUG nova.virt.hardware [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 911.263408] env[65918]: DEBUG nova.virt.hardware [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 911.264481] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-512f369f-e95a-4c80-8cd9-b62a82e8293f {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 911.273104] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c5e1a768-8553-4a82-9271-4ae008472ee8 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 911.356430] env[65918]: DEBUG nova.policy [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '14f3ef568b4c4708b2369172706e1088', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1edb24dcd43040618b3fb71f20c2c1ae', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=65918) authorize /opt/stack/nova/nova/policy.py:203}} [ 912.594938] env[65918]: DEBUG nova.network.neutron [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Successfully created port: c7a51837-f5b6-45d1-a154-16ae330a4fa0 {{(pid=65918) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 913.386901] env[65918]: DEBUG nova.network.neutron [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Successfully updated port: c7a51837-f5b6-45d1-a154-16ae330a4fa0 {{(pid=65918) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 913.398693] env[65918]: DEBUG oslo_concurrency.lockutils [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Acquiring lock "refresh_cache-46e1dfe1-df73-430c-85ef-f5753974eed0" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 913.399097] env[65918]: DEBUG oslo_concurrency.lockutils [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 
tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Acquired lock "refresh_cache-46e1dfe1-df73-430c-85ef-f5753974eed0" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 913.399097] env[65918]: DEBUG nova.network.neutron [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Building network info cache for instance {{(pid=65918) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 913.432094] env[65918]: DEBUG nova.network.neutron [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Instance cache missing network info. {{(pid=65918) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 913.432094] env[65918]: DEBUG nova.network.neutron [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Updating instance_info_cache with network_info: [] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 913.446589] env[65918]: DEBUG oslo_concurrency.lockutils [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Releasing lock "refresh_cache-46e1dfe1-df73-430c-85ef-f5753974eed0" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 913.446871] env[65918]: DEBUG nova.compute.manager [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Instance network_info: |[]| {{(pid=65918) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 913.447215] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Instance VIF info [] {{(pid=65918) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 913.452755] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Creating folder: Project (1edb24dcd43040618b3fb71f20c2c1ae). Parent ref: group-v572679. {{(pid=65918) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 913.453523] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-5ecd9959-ba3b-40d4-ac7c-40c0f4fc0d07 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 913.466665] env[65918]: INFO nova.virt.vmwareapi.vm_util [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Created folder: Project (1edb24dcd43040618b3fb71f20c2c1ae) in parent group-v572679. 
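The Folder.CreateFolder invocations above and the Folder.CreateVM_Task / "Waiting for the task" / "progress is 0%" / "completed successfully" entries that follow show oslo.vmware's invoke-then-poll pattern: the vmwareapi driver submits a SOAP call through the shared VMwareAPISession, and for *_Task methods it hands the returned task moref to wait_for_task(), which polls it (the _poll_task lines) until vCenter reports success or failure. A minimal sketch of that pattern, assuming oslo.vmware's public invoke_api/wait_for_task interface; all names below (session credentials, folder and resource-pool refs, config spec) are placeholders rather than objects from this log.

    # Sketch of the invoke-then-poll pattern behind the CreateFolder and
    # CreateVM_Task entries. Synchronous vSphere calls return their result
    # directly; *_Task calls return a task moref that wait_for_task() polls.
    from oslo_vmware import api as vmware_api


    def create_vm_in_new_folder(session, parent_folder_ref, config_spec, pool_ref):
        # Synchronous call: CreateFolder returns the new folder moref at once.
        instances_folder = session.invoke_api(
            session.vim, "CreateFolder", parent_folder_ref, name="Instances")

        # Asynchronous call: CreateVM_Task returns a task moref; wait_for_task()
        # polls it ("progress is N%") and returns the finished task info, or
        # raises if the task ended in error.
        vm_create_task = session.invoke_api(
            session.vim, "CreateVM_Task", instances_folder,
            config=config_spec, pool=pool_ref)
        task_info = session.wait_for_task(vm_create_task)
        return task_info.result  # moref of the newly created VM


    # A session comparable to the one in this log would be built roughly as
    # follows (host and credentials are placeholders):
    #   session = vmware_api.VMwareAPISession("vc.example.org", "user", "secret",
    #                                         api_retry_count=3,
    #                                         task_poll_interval=0.5)

In this excerpt the pattern completes quickly: the CreateVM_Task entry below reports 'duration_secs': 0.240697 between submission and the task finishing successfully.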
[ 913.466878] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Creating folder: Instances. Parent ref: group-v572727. {{(pid=65918) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 913.467666] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-3c89f124-f1f0-488f-9344-fd9d32a5d40d {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 913.477923] env[65918]: INFO nova.virt.vmwareapi.vm_util [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Created folder: Instances in parent group-v572727. [ 913.478172] env[65918]: DEBUG oslo.service.loopingcall [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=65918) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 913.478354] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Creating VM on the ESX host {{(pid=65918) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 913.478547] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-62b258df-d218-4ba3-a47a-20103981f12f {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 913.495550] env[65918]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 913.495550] env[65918]: value = "task-2848206" [ 913.495550] env[65918]: _type = "Task" [ 913.495550] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 913.503801] env[65918]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848206, 'name': CreateVM_Task} progress is 0%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 914.008223] env[65918]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848206, 'name': CreateVM_Task, 'duration_secs': 0.240697} completed successfully. 
{{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 914.008592] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Created VM on the ESX host {{(pid=65918) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 914.008847] env[65918]: DEBUG oslo_concurrency.lockutils [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 914.009050] env[65918]: DEBUG oslo_concurrency.lockutils [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 914.009326] env[65918]: DEBUG oslo_concurrency.lockutils [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 914.009564] env[65918]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ac1ea90b-0ff7-43af-a687-6b05a9be1e0d {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 914.015058] env[65918]: DEBUG oslo_vmware.api [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Waiting for the task: (returnval){ [ 914.015058] env[65918]: value = "session[523f5ec2-7667-1e2c-bfa3-3051980fd847]52fc44e5-7a0e-a92e-25b6-ff74a53a62fe" [ 914.015058] env[65918]: _type = "Task" [ 914.015058] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 914.023222] env[65918]: DEBUG oslo_vmware.api [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Task: {'id': session[523f5ec2-7667-1e2c-bfa3-3051980fd847]52fc44e5-7a0e-a92e-25b6-ff74a53a62fe, 'name': SearchDatastore_Task} progress is 0%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 914.373299] env[65918]: DEBUG nova.compute.manager [req-eee3f288-98a9-44a9-bd47-7b9d7f8c9331 req-c81cae26-1db4-4e43-a1f4-4477f59911de service nova] [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Received event network-changed-c7a51837-f5b6-45d1-a154-16ae330a4fa0 {{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 914.373299] env[65918]: DEBUG nova.compute.manager [req-eee3f288-98a9-44a9-bd47-7b9d7f8c9331 req-c81cae26-1db4-4e43-a1f4-4477f59911de service nova] [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Refreshing instance network info cache due to event network-changed-c7a51837-f5b6-45d1-a154-16ae330a4fa0. 
{{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 914.373299] env[65918]: DEBUG oslo_concurrency.lockutils [req-eee3f288-98a9-44a9-bd47-7b9d7f8c9331 req-c81cae26-1db4-4e43-a1f4-4477f59911de service nova] Acquiring lock "refresh_cache-46e1dfe1-df73-430c-85ef-f5753974eed0" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 914.373548] env[65918]: DEBUG oslo_concurrency.lockutils [req-eee3f288-98a9-44a9-bd47-7b9d7f8c9331 req-c81cae26-1db4-4e43-a1f4-4477f59911de service nova] Acquired lock "refresh_cache-46e1dfe1-df73-430c-85ef-f5753974eed0" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 914.373548] env[65918]: DEBUG nova.network.neutron [req-eee3f288-98a9-44a9-bd47-7b9d7f8c9331 req-c81cae26-1db4-4e43-a1f4-4477f59911de service nova] [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Refreshing network info cache for port c7a51837-f5b6-45d1-a154-16ae330a4fa0 {{(pid=65918) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 914.397997] env[65918]: DEBUG nova.network.neutron [req-eee3f288-98a9-44a9-bd47-7b9d7f8c9331 req-c81cae26-1db4-4e43-a1f4-4477f59911de service nova] [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Instance cache missing network info. {{(pid=65918) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 914.526832] env[65918]: DEBUG oslo_concurrency.lockutils [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 914.527472] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Processing image e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 914.527797] env[65918]: DEBUG oslo_concurrency.lockutils [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 914.564789] env[65918]: DEBUG nova.network.neutron [req-eee3f288-98a9-44a9-bd47-7b9d7f8c9331 req-c81cae26-1db4-4e43-a1f4-4477f59911de service nova] [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Instance is deleted, no further info cache update {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:106}} [ 914.564955] env[65918]: DEBUG oslo_concurrency.lockutils [req-eee3f288-98a9-44a9-bd47-7b9d7f8c9331 req-c81cae26-1db4-4e43-a1f4-4477f59911de service nova] Releasing lock "refresh_cache-46e1dfe1-df73-430c-85ef-f5753974eed0" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 932.419622] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=65918) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 932.423322] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 932.423511] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Starting heal instance info cache {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 932.423634] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Rebuilding the list of instances to heal {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 932.448057] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 932.448152] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 932.448293] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 932.448423] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 932.448547] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 932.448666] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Didn't find any instances for network info cache update. 
{{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 933.425697] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 933.426132] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager.update_available_resource {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 933.438175] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 933.441832] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 933.441832] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 933.441832] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=65918) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 933.441832] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f0c03ba-bf8e-4b5f-9ec4-29c5a65be5ce {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 933.449317] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b60c7a22-35d8-4bf1-a287-4d7dbb0392ce {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 933.463917] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da895f70-bc82-4433-9ffa-4a968621b3bb {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 933.471703] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc8506ef-1aee-43d7-9959-1fa32e671a59 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 933.506538] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181031MB free_disk=168GB free_vcpus=48 pci_devices=None {{(pid=65918) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 933.509992] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Acquiring lock 
"compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 933.509992] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 933.566939] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 5c83d7da-f63b-40b7-a1aa-916ba9343439 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 933.567185] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance bd0158bd-e255-4680-b00e-81eb1ce88ad5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 933.567383] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 51163f89-c8b6-48a8-bbbe-de63c44d92a5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 933.567528] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 933.567655] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance bba6f3d9-1be3-4048-86d5-f435511b0fc0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 933.581972] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance c9932955-3b82-4c30-9441-b33695340ed2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 933.582233] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Total usable vcpus: 48, total allocated vcpus: 5 {{(pid=65918) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 933.582388] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1152MB phys_disk=200GB used_disk=5GB total_vcpus=48 used_vcpus=5 pci_stats=[] {{(pid=65918) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 933.702854] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5af88220-52be-4947-94dd-fd5d3e8d89ca {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 933.712960] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2399e03f-27dc-47f8-8700-18e4e1afaf5a {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 933.744548] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4e14f37-1c64-4f18-9828-90c548ad83bb {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 933.757896] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6685cb54-4f19-43ec-b0e8-02e5cba0a070 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 933.776908] env[65918]: DEBUG nova.compute.provider_tree [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Inventory has not changed in ProviderTree for provider: 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 {{(pid=65918) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 933.790535] env[65918]: DEBUG nova.scheduler.client.report [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Inventory has not changed for provider 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65918) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 933.812465] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=65918) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 933.812678] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.306s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 934.811481] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running 
periodic task ComputeManager._instance_usage_audit {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 934.811801] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 934.811854] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=65918) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 935.424741] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 936.424581] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 937.423562] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 939.419657] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 940.922548] env[65918]: DEBUG oslo_concurrency.lockutils [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] Acquiring lock "3faaaacf-815e-4493-81a7-2a32f868442a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 940.922814] env[65918]: DEBUG oslo_concurrency.lockutils [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] Lock "3faaaacf-815e-4493-81a7-2a32f868442a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 942.626713] env[65918]: DEBUG oslo_concurrency.lockutils [None req-33b08c2a-a672-4ec7-906f-f008b89ba893 tempest-AttachVolumeShelveTestJSON-1762481057 tempest-AttachVolumeShelveTestJSON-1762481057-project-member] Acquiring lock "8017964a-7fe8-40eb-a79d-47e0401a27d1" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 942.627011] env[65918]: DEBUG oslo_concurrency.lockutils [None req-33b08c2a-a672-4ec7-906f-f008b89ba893 tempest-AttachVolumeShelveTestJSON-1762481057 tempest-AttachVolumeShelveTestJSON-1762481057-project-member] Lock "8017964a-7fe8-40eb-a79d-47e0401a27d1" acquired by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 946.414662] env[65918]: DEBUG oslo_concurrency.lockutils [None req-5bfef4da-d1f4-47fc-a19c-4d4d492d3ff5 tempest-ServerActionsTestOtherA-1867635188 tempest-ServerActionsTestOtherA-1867635188-project-member] Acquiring lock "a0f06a58-65d2-4325-8f93-0948b4e5ac8c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 946.414957] env[65918]: DEBUG oslo_concurrency.lockutils [None req-5bfef4da-d1f4-47fc-a19c-4d4d492d3ff5 tempest-ServerActionsTestOtherA-1867635188 tempest-ServerActionsTestOtherA-1867635188-project-member] Lock "a0f06a58-65d2-4325-8f93-0948b4e5ac8c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 951.854028] env[65918]: WARNING oslo_vmware.rw_handles [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 951.854028] env[65918]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 951.854028] env[65918]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 951.854028] env[65918]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 951.854028] env[65918]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 951.854028] env[65918]: ERROR oslo_vmware.rw_handles response.begin() [ 951.854028] env[65918]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 951.854028] env[65918]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 951.854028] env[65918]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 951.854028] env[65918]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 951.854028] env[65918]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 951.854028] env[65918]: ERROR oslo_vmware.rw_handles [ 951.854028] env[65918]: DEBUG nova.virt.vmwareapi.images [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Downloaded image file data e017c336-3a02-4b58-874a-44a1d1e154fd to vmware_temp/2255a211-eba1-4be7-a5a0-a045525f4746/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk on the data store datastore1 {{(pid=65918) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 951.856014] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Caching image {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 
951.856275] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Copying Virtual Disk [datastore1] vmware_temp/2255a211-eba1-4be7-a5a0-a045525f4746/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk to [datastore1] vmware_temp/2255a211-eba1-4be7-a5a0-a045525f4746/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk {{(pid=65918) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 951.856565] env[65918]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-01592051-a90f-4900-a9f2-b25ede84448d {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 951.865414] env[65918]: DEBUG oslo_vmware.api [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Waiting for the task: (returnval){ [ 951.865414] env[65918]: value = "task-2848207" [ 951.865414] env[65918]: _type = "Task" [ 951.865414] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 951.872925] env[65918]: DEBUG oslo_vmware.api [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Task: {'id': task-2848207, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 952.376061] env[65918]: DEBUG oslo_vmware.exceptions [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Fault InvalidArgument not matched. 
{{(pid=65918) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 952.376326] env[65918]: DEBUG oslo_concurrency.lockutils [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 952.376900] env[65918]: ERROR nova.compute.manager [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 952.376900] env[65918]: Faults: ['InvalidArgument'] [ 952.376900] env[65918]: ERROR nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Traceback (most recent call last): [ 952.376900] env[65918]: ERROR nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 952.376900] env[65918]: ERROR nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] yield resources [ 952.376900] env[65918]: ERROR nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 952.376900] env[65918]: ERROR nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] self.driver.spawn(context, instance, image_meta, [ 952.376900] env[65918]: ERROR nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 952.376900] env[65918]: ERROR nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] self._vmops.spawn(context, instance, image_meta, injected_files, [ 952.376900] env[65918]: ERROR nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 952.376900] env[65918]: ERROR nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] self._fetch_image_if_missing(context, vi) [ 952.376900] env[65918]: ERROR nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 952.377346] env[65918]: ERROR nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] image_cache(vi, tmp_image_ds_loc) [ 952.377346] env[65918]: ERROR nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 952.377346] env[65918]: ERROR nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] vm_util.copy_virtual_disk( [ 952.377346] env[65918]: ERROR nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 952.377346] env[65918]: ERROR nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] session._wait_for_task(vmdk_copy_task) [ 952.377346] env[65918]: ERROR nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in 
_wait_for_task [ 952.377346] env[65918]: ERROR nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] return self.wait_for_task(task_ref) [ 952.377346] env[65918]: ERROR nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 952.377346] env[65918]: ERROR nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] return evt.wait() [ 952.377346] env[65918]: ERROR nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 952.377346] env[65918]: ERROR nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] result = hub.switch() [ 952.377346] env[65918]: ERROR nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 952.377346] env[65918]: ERROR nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] return self.greenlet.switch() [ 952.377685] env[65918]: ERROR nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 952.377685] env[65918]: ERROR nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] self.f(*self.args, **self.kw) [ 952.377685] env[65918]: ERROR nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 952.377685] env[65918]: ERROR nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] raise exceptions.translate_fault(task_info.error) [ 952.377685] env[65918]: ERROR nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 952.377685] env[65918]: ERROR nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Faults: ['InvalidArgument'] [ 952.377685] env[65918]: ERROR nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] [ 952.377685] env[65918]: INFO nova.compute.manager [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Terminating instance [ 952.378817] env[65918]: DEBUG oslo_concurrency.lockutils [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 952.379051] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 952.379290] env[65918]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-de1b18de-a8f0-4803-8fb3-73378c986e9e {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 952.381335] 
env[65918]: DEBUG nova.compute.manager [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Start destroying the instance on the hypervisor. {{(pid=65918) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 952.381537] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Destroying instance {{(pid=65918) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 952.382258] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8576d0d6-98c8-467d-a7ef-77983b025d9c {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 952.388796] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Unregistering the VM {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 952.389018] env[65918]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-1b9e9e4f-0e15-48e3-98bb-21e350bf3873 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 952.391042] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 952.391219] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=65918) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 952.392113] env[65918]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-44acf8e5-2cf3-4fbb-aa91-c466f66d406e {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 952.396558] env[65918]: DEBUG oslo_vmware.api [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Waiting for the task: (returnval){ [ 952.396558] env[65918]: value = "session[523f5ec2-7667-1e2c-bfa3-3051980fd847]52f18add-a58b-d832-50dd-850b32927b6b" [ 952.396558] env[65918]: _type = "Task" [ 952.396558] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 952.405271] env[65918]: DEBUG oslo_vmware.api [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Task: {'id': session[523f5ec2-7667-1e2c-bfa3-3051980fd847]52f18add-a58b-d832-50dd-850b32927b6b, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 952.463678] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Unregistered the VM {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 952.463868] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Deleting contents of the VM from datastore datastore1 {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 952.463988] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Deleting the datastore file [datastore1] 5c83d7da-f63b-40b7-a1aa-916ba9343439 {{(pid=65918) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 952.464265] env[65918]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-57581e38-7e21-4642-b954-4d907548f4bf {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 952.470342] env[65918]: DEBUG oslo_vmware.api [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Waiting for the task: (returnval){ [ 952.470342] env[65918]: value = "task-2848209" [ 952.470342] env[65918]: _type = "Task" [ 952.470342] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 952.477707] env[65918]: DEBUG oslo_vmware.api [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Task: {'id': task-2848209, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 952.909560] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Preparing fetch location {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 952.909957] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Creating directory with path [datastore1] vmware_temp/ac2e0474-cf55-41fa-b631-d5dfe3c92248/e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 952.910251] env[65918]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-84a6740d-1e6c-42c0-95c2-d9b2e4a913c7 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 952.924383] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Created directory with path [datastore1] vmware_temp/ac2e0474-cf55-41fa-b631-d5dfe3c92248/e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 952.924577] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Fetch image to [datastore1] vmware_temp/ac2e0474-cf55-41fa-b631-d5dfe3c92248/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 952.924736] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Downloading image file data e017c336-3a02-4b58-874a-44a1d1e154fd to [datastore1] vmware_temp/ac2e0474-cf55-41fa-b631-d5dfe3c92248/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk on the data store datastore1 {{(pid=65918) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 952.925487] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-032e6953-d1e9-49b0-8a88-c1d8a4606f05 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 952.931936] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b88e8d16-46da-46ee-a7f4-50109fe76c03 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 952.941498] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82931d91-5ece-4459-b2f0-9d90a1e804f2 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 952.975292] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d501ac4-a5ef-49d5-8013-eb5d5bfb2f1f 
{{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 952.983050] env[65918]: DEBUG oslo_vmware.api [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Task: {'id': task-2848209, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.077199} completed successfully. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 952.984389] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Deleted the datastore file {{(pid=65918) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 952.984586] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Deleted contents of the VM from datastore datastore1 {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 952.984793] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Instance destroyed {{(pid=65918) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 952.984988] env[65918]: INFO nova.compute.manager [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 952.986721] env[65918]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-492dbb29-eda3-4269-96e5-d07641490c1b {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 952.988574] env[65918]: DEBUG nova.compute.claims [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Aborting claim: {{(pid=65918) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 952.988740] env[65918]: DEBUG oslo_concurrency.lockutils [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 952.988940] env[65918]: DEBUG oslo_concurrency.lockutils [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 953.010770] env[65918]: DEBUG nova.virt.vmwareapi.images [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Downloading image file data e017c336-3a02-4b58-874a-44a1d1e154fd to the data store datastore1 {{(pid=65918) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 953.106775] env[65918]: DEBUG oslo_vmware.rw_handles [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ac2e0474-cf55-41fa-b631-d5dfe3c92248/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=65918) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 953.168044] env[65918]: DEBUG oslo_vmware.rw_handles [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Completed reading data from the image iterator. {{(pid=65918) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 953.168279] env[65918]: DEBUG oslo_vmware.rw_handles [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ac2e0474-cf55-41fa-b631-d5dfe3c92248/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=65918) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 953.189502] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c91178b3-4b40-4df2-bd4a-88e5a3e6f51c {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 953.197061] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14dd7da2-0776-4878-8457-97709ecf5942 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 953.227721] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f593f11b-d980-4a0a-9223-7d003743b02e {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 953.234624] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3313bc0d-3c4e-4e18-ab1a-79469cfeb646 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 953.247467] env[65918]: DEBUG nova.compute.provider_tree [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Inventory has not changed in ProviderTree for provider: 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 {{(pid=65918) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 953.257114] env[65918]: DEBUG nova.scheduler.client.report [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Inventory has not changed for provider 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65918) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 953.270116] env[65918]: DEBUG oslo_concurrency.lockutils [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.281s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 953.270650] env[65918]: ERROR nova.compute.manager [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 953.270650] env[65918]: Faults: ['InvalidArgument'] [ 953.270650] env[65918]: ERROR nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Traceback (most recent call last): [ 953.270650] env[65918]: ERROR nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 953.270650] env[65918]: ERROR nova.compute.manager [instance: 
5c83d7da-f63b-40b7-a1aa-916ba9343439] self.driver.spawn(context, instance, image_meta, [ 953.270650] env[65918]: ERROR nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 953.270650] env[65918]: ERROR nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] self._vmops.spawn(context, instance, image_meta, injected_files, [ 953.270650] env[65918]: ERROR nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 953.270650] env[65918]: ERROR nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] self._fetch_image_if_missing(context, vi) [ 953.270650] env[65918]: ERROR nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 953.270650] env[65918]: ERROR nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] image_cache(vi, tmp_image_ds_loc) [ 953.270650] env[65918]: ERROR nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 953.270976] env[65918]: ERROR nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] vm_util.copy_virtual_disk( [ 953.270976] env[65918]: ERROR nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 953.270976] env[65918]: ERROR nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] session._wait_for_task(vmdk_copy_task) [ 953.270976] env[65918]: ERROR nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 953.270976] env[65918]: ERROR nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] return self.wait_for_task(task_ref) [ 953.270976] env[65918]: ERROR nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 953.270976] env[65918]: ERROR nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] return evt.wait() [ 953.270976] env[65918]: ERROR nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 953.270976] env[65918]: ERROR nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] result = hub.switch() [ 953.270976] env[65918]: ERROR nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 953.270976] env[65918]: ERROR nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] return self.greenlet.switch() [ 953.270976] env[65918]: ERROR nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 953.270976] env[65918]: ERROR nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] self.f(*self.args, **self.kw) [ 953.271296] env[65918]: ERROR nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 953.271296] env[65918]: ERROR 
nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] raise exceptions.translate_fault(task_info.error) [ 953.271296] env[65918]: ERROR nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 953.271296] env[65918]: ERROR nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Faults: ['InvalidArgument'] [ 953.271296] env[65918]: ERROR nova.compute.manager [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] [ 953.271407] env[65918]: DEBUG nova.compute.utils [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] VimFaultException {{(pid=65918) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 953.272773] env[65918]: DEBUG nova.compute.manager [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Build of instance 5c83d7da-f63b-40b7-a1aa-916ba9343439 was re-scheduled: A specified parameter was not correct: fileType [ 953.272773] env[65918]: Faults: ['InvalidArgument'] {{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 953.273146] env[65918]: DEBUG nova.compute.manager [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Unplugging VIFs for instance {{(pid=65918) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 953.273316] env[65918]: DEBUG nova.compute.manager [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=65918) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 953.273466] env[65918]: DEBUG nova.compute.manager [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Deallocating network for instance {{(pid=65918) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 953.273621] env[65918]: DEBUG nova.network.neutron [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] deallocate_for_instance() {{(pid=65918) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 953.529200] env[65918]: DEBUG nova.network.neutron [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Updating instance_info_cache with network_info: [] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 953.540916] env[65918]: INFO nova.compute.manager [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Took 0.27 seconds to deallocate network for instance. [ 953.630903] env[65918]: INFO nova.scheduler.client.report [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Deleted allocations for instance 5c83d7da-f63b-40b7-a1aa-916ba9343439 [ 953.646101] env[65918]: DEBUG oslo_concurrency.lockutils [None req-195a54e8-4279-468d-a0ec-899aa5dfab8b tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Lock "5c83d7da-f63b-40b7-a1aa-916ba9343439" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 325.560s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 953.647244] env[65918]: DEBUG oslo_concurrency.lockutils [None req-e4c57f0d-a21b-46bf-9f06-d1093ceaaecc tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Lock "5c83d7da-f63b-40b7-a1aa-916ba9343439" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 126.886s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 953.647464] env[65918]: DEBUG oslo_concurrency.lockutils [None req-e4c57f0d-a21b-46bf-9f06-d1093ceaaecc tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Acquiring lock "5c83d7da-f63b-40b7-a1aa-916ba9343439-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 953.647764] env[65918]: DEBUG oslo_concurrency.lockutils [None req-e4c57f0d-a21b-46bf-9f06-d1093ceaaecc tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Lock "5c83d7da-f63b-40b7-a1aa-916ba9343439-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 953.647844] env[65918]: DEBUG oslo_concurrency.lockutils [None req-e4c57f0d-a21b-46bf-9f06-d1093ceaaecc tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Lock "5c83d7da-f63b-40b7-a1aa-916ba9343439-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 953.649660] env[65918]: INFO nova.compute.manager [None req-e4c57f0d-a21b-46bf-9f06-d1093ceaaecc tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Terminating instance [ 953.652023] env[65918]: DEBUG nova.compute.manager [None req-e4c57f0d-a21b-46bf-9f06-d1093ceaaecc tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Start destroying the instance on the hypervisor. {{(pid=65918) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 953.652023] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-e4c57f0d-a21b-46bf-9f06-d1093ceaaecc tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Destroying instance {{(pid=65918) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 953.652023] env[65918]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-e64a0c1e-4029-400a-b977-a4a3146442f6 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 953.660834] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab129734-6b1e-4270-ba1a-db98fb9a1efb {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 953.671626] env[65918]: DEBUG nova.compute.manager [None req-79c3c05b-9782-4d61-b5ee-7c85bd1cee6b tempest-DeleteServersTestJSON-2103343594 tempest-DeleteServersTestJSON-2103343594-project-member] [instance: c07fe815-199c-41e8-b102-d2ffae0bb12c] Starting instance... {{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 953.691395] env[65918]: WARNING nova.virt.vmwareapi.vmops [None req-e4c57f0d-a21b-46bf-9f06-d1093ceaaecc tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 5c83d7da-f63b-40b7-a1aa-916ba9343439 could not be found. 
[ 953.691624] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-e4c57f0d-a21b-46bf-9f06-d1093ceaaecc tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Instance destroyed {{(pid=65918) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 953.691801] env[65918]: INFO nova.compute.manager [None req-e4c57f0d-a21b-46bf-9f06-d1093ceaaecc tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Took 0.04 seconds to destroy the instance on the hypervisor. [ 953.692048] env[65918]: DEBUG oslo.service.loopingcall [None req-e4c57f0d-a21b-46bf-9f06-d1093ceaaecc tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=65918) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 953.692269] env[65918]: DEBUG nova.compute.manager [-] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Deallocating network for instance {{(pid=65918) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 953.692363] env[65918]: DEBUG nova.network.neutron [-] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] deallocate_for_instance() {{(pid=65918) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 953.694626] env[65918]: DEBUG nova.compute.manager [None req-79c3c05b-9782-4d61-b5ee-7c85bd1cee6b tempest-DeleteServersTestJSON-2103343594 tempest-DeleteServersTestJSON-2103343594-project-member] [instance: c07fe815-199c-41e8-b102-d2ffae0bb12c] Instance disappeared before build. {{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 953.713290] env[65918]: DEBUG oslo_concurrency.lockutils [None req-79c3c05b-9782-4d61-b5ee-7c85bd1cee6b tempest-DeleteServersTestJSON-2103343594 tempest-DeleteServersTestJSON-2103343594-project-member] Lock "c07fe815-199c-41e8-b102-d2ffae0bb12c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 236.097s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 953.716624] env[65918]: DEBUG nova.network.neutron [-] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Updating instance_info_cache with network_info: [] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 953.721033] env[65918]: DEBUG nova.compute.manager [None req-83c5f202-7d7d-4f1a-83b6-697f2b8e6f7f tempest-MultipleCreateTestJSON-1481152747 tempest-MultipleCreateTestJSON-1481152747-project-member] [instance: 6af69c4f-4822-4170-94a0-cdc587c825f7] Starting instance... {{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 953.723487] env[65918]: INFO nova.compute.manager [-] [instance: 5c83d7da-f63b-40b7-a1aa-916ba9343439] Took 0.03 seconds to deallocate network for instance. [ 953.743560] env[65918]: DEBUG nova.compute.manager [None req-83c5f202-7d7d-4f1a-83b6-697f2b8e6f7f tempest-MultipleCreateTestJSON-1481152747 tempest-MultipleCreateTestJSON-1481152747-project-member] [instance: 6af69c4f-4822-4170-94a0-cdc587c825f7] Instance disappeared before build. 
{{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 953.763029] env[65918]: DEBUG oslo_concurrency.lockutils [None req-83c5f202-7d7d-4f1a-83b6-697f2b8e6f7f tempest-MultipleCreateTestJSON-1481152747 tempest-MultipleCreateTestJSON-1481152747-project-member] Lock "6af69c4f-4822-4170-94a0-cdc587c825f7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 232.347s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 953.776976] env[65918]: DEBUG nova.compute.manager [None req-83c5f202-7d7d-4f1a-83b6-697f2b8e6f7f tempest-MultipleCreateTestJSON-1481152747 tempest-MultipleCreateTestJSON-1481152747-project-member] [instance: efe8bbf3-f76d-4509-85d1-ffff559358b5] Starting instance... {{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 953.797258] env[65918]: DEBUG nova.compute.manager [None req-83c5f202-7d7d-4f1a-83b6-697f2b8e6f7f tempest-MultipleCreateTestJSON-1481152747 tempest-MultipleCreateTestJSON-1481152747-project-member] [instance: efe8bbf3-f76d-4509-85d1-ffff559358b5] Instance disappeared before build. {{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 953.812931] env[65918]: DEBUG oslo_concurrency.lockutils [None req-e4c57f0d-a21b-46bf-9f06-d1093ceaaecc tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Lock "5c83d7da-f63b-40b7-a1aa-916ba9343439" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.166s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 953.820504] env[65918]: DEBUG oslo_concurrency.lockutils [None req-83c5f202-7d7d-4f1a-83b6-697f2b8e6f7f tempest-MultipleCreateTestJSON-1481152747 tempest-MultipleCreateTestJSON-1481152747-project-member] Lock "efe8bbf3-f76d-4509-85d1-ffff559358b5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 232.370s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 953.828854] env[65918]: DEBUG nova.compute.manager [None req-f0b21c67-50f6-4525-b223-8e361de16751 tempest-SecurityGroupsTestJSON-607972084 tempest-SecurityGroupsTestJSON-607972084-project-member] [instance: 1e1d75b1-b4c3-4c72-a8e9-4e2f1b5103d4] Starting instance... {{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 953.853141] env[65918]: DEBUG nova.compute.manager [None req-f0b21c67-50f6-4525-b223-8e361de16751 tempest-SecurityGroupsTestJSON-607972084 tempest-SecurityGroupsTestJSON-607972084-project-member] [instance: 1e1d75b1-b4c3-4c72-a8e9-4e2f1b5103d4] Instance disappeared before build. 
{{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 953.873109] env[65918]: DEBUG oslo_concurrency.lockutils [None req-f0b21c67-50f6-4525-b223-8e361de16751 tempest-SecurityGroupsTestJSON-607972084 tempest-SecurityGroupsTestJSON-607972084-project-member] Lock "1e1d75b1-b4c3-4c72-a8e9-4e2f1b5103d4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 231.043s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 953.881339] env[65918]: DEBUG nova.compute.manager [None req-399b6463-c818-424e-806d-f644084c30ef tempest-FloatingIPsAssociationTestJSON-1535304117 tempest-FloatingIPsAssociationTestJSON-1535304117-project-member] [instance: b94157e3-2da8-4709-b1cf-b2bb14e0a6f3] Starting instance... {{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 953.907191] env[65918]: DEBUG nova.compute.manager [None req-399b6463-c818-424e-806d-f644084c30ef tempest-FloatingIPsAssociationTestJSON-1535304117 tempest-FloatingIPsAssociationTestJSON-1535304117-project-member] [instance: b94157e3-2da8-4709-b1cf-b2bb14e0a6f3] Instance disappeared before build. {{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 953.930140] env[65918]: DEBUG oslo_concurrency.lockutils [None req-399b6463-c818-424e-806d-f644084c30ef tempest-FloatingIPsAssociationTestJSON-1535304117 tempest-FloatingIPsAssociationTestJSON-1535304117-project-member] Lock "b94157e3-2da8-4709-b1cf-b2bb14e0a6f3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 227.797s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 953.938204] env[65918]: DEBUG nova.compute.manager [None req-671abe19-9d2a-44a0-8b32-4e8a272e058d tempest-ImagesOneServerNegativeTestJSON-561303778 tempest-ImagesOneServerNegativeTestJSON-561303778-project-member] [instance: 529e8acc-2775-4eac-8e99-7e901a08f1d4] Starting instance... {{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 953.959485] env[65918]: DEBUG nova.compute.manager [None req-671abe19-9d2a-44a0-8b32-4e8a272e058d tempest-ImagesOneServerNegativeTestJSON-561303778 tempest-ImagesOneServerNegativeTestJSON-561303778-project-member] [instance: 529e8acc-2775-4eac-8e99-7e901a08f1d4] Instance disappeared before build. {{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 953.979197] env[65918]: DEBUG oslo_concurrency.lockutils [None req-671abe19-9d2a-44a0-8b32-4e8a272e058d tempest-ImagesOneServerNegativeTestJSON-561303778 tempest-ImagesOneServerNegativeTestJSON-561303778-project-member] Lock "529e8acc-2775-4eac-8e99-7e901a08f1d4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 226.642s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 953.987390] env[65918]: DEBUG nova.compute.manager [None req-13b00a8c-773b-4b30-9aa7-7392da7c463a tempest-ServerPasswordTestJSON-953704500 tempest-ServerPasswordTestJSON-953704500-project-member] [instance: f26e4561-b450-4582-a415-a90a4dda7837] Starting instance... 
{{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 954.009252] env[65918]: DEBUG nova.compute.manager [None req-13b00a8c-773b-4b30-9aa7-7392da7c463a tempest-ServerPasswordTestJSON-953704500 tempest-ServerPasswordTestJSON-953704500-project-member] [instance: f26e4561-b450-4582-a415-a90a4dda7837] Instance disappeared before build. {{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 954.029400] env[65918]: DEBUG oslo_concurrency.lockutils [None req-13b00a8c-773b-4b30-9aa7-7392da7c463a tempest-ServerPasswordTestJSON-953704500 tempest-ServerPasswordTestJSON-953704500-project-member] Lock "f26e4561-b450-4582-a415-a90a4dda7837" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 221.182s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 954.038424] env[65918]: DEBUG nova.compute.manager [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] [instance: c9932955-3b82-4c30-9441-b33695340ed2] Starting instance... {{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 954.081565] env[65918]: DEBUG oslo_concurrency.lockutils [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 954.081792] env[65918]: DEBUG oslo_concurrency.lockutils [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 954.083623] env[65918]: INFO nova.compute.claims [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] [instance: c9932955-3b82-4c30-9441-b33695340ed2] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 954.224506] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d290af1b-dc88-4670-ba18-27a5a662c26d {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 954.233129] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f23e3a05-5d78-46ec-a9d5-04d01967bb57 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 954.263624] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3c1b05f-d866-45bf-82f6-71e47a112c3e {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 954.271599] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b609bae-9531-4ee6-8015-0cb183a76de1 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 954.285425] env[65918]: DEBUG nova.compute.provider_tree [None 
req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] Inventory has not changed in ProviderTree for provider: 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 {{(pid=65918) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 954.293841] env[65918]: DEBUG nova.scheduler.client.report [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] Inventory has not changed for provider 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65918) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 954.306640] env[65918]: DEBUG oslo_concurrency.lockutils [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.225s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 954.307172] env[65918]: DEBUG nova.compute.manager [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] [instance: c9932955-3b82-4c30-9441-b33695340ed2] Start building networks asynchronously for instance. {{(pid=65918) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 954.337265] env[65918]: DEBUG nova.compute.utils [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] Using /dev/sd instead of None {{(pid=65918) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 954.338803] env[65918]: DEBUG nova.compute.manager [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] [instance: c9932955-3b82-4c30-9441-b33695340ed2] Allocating IP information in the background. {{(pid=65918) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 954.338972] env[65918]: DEBUG nova.network.neutron [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] [instance: c9932955-3b82-4c30-9441-b33695340ed2] allocate_for_instance() {{(pid=65918) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 954.353596] env[65918]: DEBUG nova.compute.manager [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] [instance: c9932955-3b82-4c30-9441-b33695340ed2] Start building block device mappings for instance. 
{{(pid=65918) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 954.401473] env[65918]: DEBUG nova.policy [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd37072cdd2aa4b0baeb22e3a42c12317', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '959758ea621441928f05c0522a7248df', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=65918) authorize /opt/stack/nova/nova/policy.py:203}} [ 954.422682] env[65918]: DEBUG nova.compute.manager [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] [instance: c9932955-3b82-4c30-9441-b33695340ed2] Start spawning the instance on the hypervisor. {{(pid=65918) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 954.444121] env[65918]: DEBUG nova.virt.hardware [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:52:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:52:33Z,direct_url=,disk_format='vmdk',id=e017c336-3a02-4b58-874a-44a1d1e154fd,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='a06639ef48fd43768e273db76d6c8f54',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:52:33Z,virtual_size=,visibility=), allow threads: False {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 954.444375] env[65918]: DEBUG nova.virt.hardware [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] Flavor limits 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 954.444534] env[65918]: DEBUG nova.virt.hardware [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] Image limits 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 954.444740] env[65918]: DEBUG nova.virt.hardware [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] Flavor pref 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 954.444901] env[65918]: DEBUG nova.virt.hardware [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] Image pref 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 954.445128] env[65918]: DEBUG nova.virt.hardware [None 
req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 954.445255] env[65918]: DEBUG nova.virt.hardware [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 954.445409] env[65918]: DEBUG nova.virt.hardware [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=65918) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 954.445570] env[65918]: DEBUG nova.virt.hardware [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] Got 1 possible topologies {{(pid=65918) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 954.445729] env[65918]: DEBUG nova.virt.hardware [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 954.445899] env[65918]: DEBUG nova.virt.hardware [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 954.446975] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d547424-d346-4a27-ad67-0be50c6c964b {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 954.453177] env[65918]: DEBUG oslo_concurrency.lockutils [None req-09d594fa-82ce-4c0d-b5cc-e32fe0de1bde tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Acquiring lock "81fef129-8f9a-4a19-afc0-f27411c36159" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 954.453393] env[65918]: DEBUG oslo_concurrency.lockutils [None req-09d594fa-82ce-4c0d-b5cc-e32fe0de1bde tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Lock "81fef129-8f9a-4a19-afc0-f27411c36159" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 954.458157] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d02deec9-9dc3-4755-98a7-6d44c423fabb {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 954.721556] env[65918]: DEBUG 
nova.network.neutron [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] [instance: c9932955-3b82-4c30-9441-b33695340ed2] Successfully created port: 3a870bf7-eff5-48a4-8da7-e2d57e29a226 {{(pid=65918) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 955.257700] env[65918]: DEBUG nova.network.neutron [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] [instance: c9932955-3b82-4c30-9441-b33695340ed2] Successfully updated port: 3a870bf7-eff5-48a4-8da7-e2d57e29a226 {{(pid=65918) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 955.266528] env[65918]: DEBUG oslo_concurrency.lockutils [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] Acquiring lock "refresh_cache-c9932955-3b82-4c30-9441-b33695340ed2" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 955.266644] env[65918]: DEBUG oslo_concurrency.lockutils [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] Acquired lock "refresh_cache-c9932955-3b82-4c30-9441-b33695340ed2" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 955.266838] env[65918]: DEBUG nova.network.neutron [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] [instance: c9932955-3b82-4c30-9441-b33695340ed2] Building network info cache for instance {{(pid=65918) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 955.302319] env[65918]: DEBUG nova.network.neutron [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] [instance: c9932955-3b82-4c30-9441-b33695340ed2] Instance cache missing network info. 
{{(pid=65918) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 955.447521] env[65918]: DEBUG nova.network.neutron [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] [instance: c9932955-3b82-4c30-9441-b33695340ed2] Updating instance_info_cache with network_info: [{"id": "3a870bf7-eff5-48a4-8da7-e2d57e29a226", "address": "fa:16:3e:7a:df:8e", "network": {"id": "f6ef8be4-8990-48f7-812e-4f324a776778", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-923208939-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "959758ea621441928f05c0522a7248df", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "99be9a5e-b3f9-4e6c-83d5-df11f817847d", "external-id": "nsx-vlan-transportzone-566", "segmentation_id": 566, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3a870bf7-ef", "ovs_interfaceid": "3a870bf7-eff5-48a4-8da7-e2d57e29a226", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 955.459741] env[65918]: DEBUG oslo_concurrency.lockutils [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] Releasing lock "refresh_cache-c9932955-3b82-4c30-9441-b33695340ed2" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 955.460034] env[65918]: DEBUG nova.compute.manager [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] [instance: c9932955-3b82-4c30-9441-b33695340ed2] Instance network_info: |[{"id": "3a870bf7-eff5-48a4-8da7-e2d57e29a226", "address": "fa:16:3e:7a:df:8e", "network": {"id": "f6ef8be4-8990-48f7-812e-4f324a776778", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-923208939-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "959758ea621441928f05c0522a7248df", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "99be9a5e-b3f9-4e6c-83d5-df11f817847d", "external-id": "nsx-vlan-transportzone-566", "segmentation_id": 566, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3a870bf7-ef", "ovs_interfaceid": "3a870bf7-eff5-48a4-8da7-e2d57e29a226", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=65918) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 955.460409] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None 
req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] [instance: c9932955-3b82-4c30-9441-b33695340ed2] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:7a:df:8e', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '99be9a5e-b3f9-4e6c-83d5-df11f817847d', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '3a870bf7-eff5-48a4-8da7-e2d57e29a226', 'vif_model': 'vmxnet3'}] {{(pid=65918) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 955.467918] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] Creating folder: Project (959758ea621441928f05c0522a7248df). Parent ref: group-v572679. {{(pid=65918) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 955.468432] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f273bf19-d060-4a64-b1ae-2289507c233b {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 955.479238] env[65918]: INFO nova.virt.vmwareapi.vm_util [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] Created folder: Project (959758ea621441928f05c0522a7248df) in parent group-v572679. [ 955.479421] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] Creating folder: Instances. Parent ref: group-v572730. {{(pid=65918) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 955.479642] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-97482f99-d1e9-43f1-a88d-427bc0d45832 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 955.489258] env[65918]: INFO nova.virt.vmwareapi.vm_util [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] Created folder: Instances in parent group-v572730. [ 955.489491] env[65918]: DEBUG oslo.service.loopingcall [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=65918) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 955.489666] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c9932955-3b82-4c30-9441-b33695340ed2] Creating VM on the ESX host {{(pid=65918) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 955.489861] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-2851c6cc-4bad-4c38-b8f3-fc4724bd4fad {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 955.510344] env[65918]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 955.510344] env[65918]: value = "task-2848212" [ 955.510344] env[65918]: _type = "Task" [ 955.510344] env[65918]: } to complete. 
{{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 955.518337] env[65918]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848212, 'name': CreateVM_Task} progress is 0%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 955.819356] env[65918]: DEBUG nova.compute.manager [req-c673eb6c-12c9-4eae-9bd0-3dbb1ff1a424 req-3d533b04-3d21-4664-863b-fc8b28abb0e7 service nova] [instance: c9932955-3b82-4c30-9441-b33695340ed2] Received event network-vif-plugged-3a870bf7-eff5-48a4-8da7-e2d57e29a226 {{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 955.819356] env[65918]: DEBUG oslo_concurrency.lockutils [req-c673eb6c-12c9-4eae-9bd0-3dbb1ff1a424 req-3d533b04-3d21-4664-863b-fc8b28abb0e7 service nova] Acquiring lock "c9932955-3b82-4c30-9441-b33695340ed2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 955.819356] env[65918]: DEBUG oslo_concurrency.lockutils [req-c673eb6c-12c9-4eae-9bd0-3dbb1ff1a424 req-3d533b04-3d21-4664-863b-fc8b28abb0e7 service nova] Lock "c9932955-3b82-4c30-9441-b33695340ed2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 955.819356] env[65918]: DEBUG oslo_concurrency.lockutils [req-c673eb6c-12c9-4eae-9bd0-3dbb1ff1a424 req-3d533b04-3d21-4664-863b-fc8b28abb0e7 service nova] Lock "c9932955-3b82-4c30-9441-b33695340ed2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 955.819472] env[65918]: DEBUG nova.compute.manager [req-c673eb6c-12c9-4eae-9bd0-3dbb1ff1a424 req-3d533b04-3d21-4664-863b-fc8b28abb0e7 service nova] [instance: c9932955-3b82-4c30-9441-b33695340ed2] No waiting events found dispatching network-vif-plugged-3a870bf7-eff5-48a4-8da7-e2d57e29a226 {{(pid=65918) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 955.819611] env[65918]: WARNING nova.compute.manager [req-c673eb6c-12c9-4eae-9bd0-3dbb1ff1a424 req-3d533b04-3d21-4664-863b-fc8b28abb0e7 service nova] [instance: c9932955-3b82-4c30-9441-b33695340ed2] Received unexpected event network-vif-plugged-3a870bf7-eff5-48a4-8da7-e2d57e29a226 for instance with vm_state building and task_state spawning. [ 955.819795] env[65918]: DEBUG nova.compute.manager [req-c673eb6c-12c9-4eae-9bd0-3dbb1ff1a424 req-3d533b04-3d21-4664-863b-fc8b28abb0e7 service nova] [instance: c9932955-3b82-4c30-9441-b33695340ed2] Received event network-changed-3a870bf7-eff5-48a4-8da7-e2d57e29a226 {{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 955.819952] env[65918]: DEBUG nova.compute.manager [req-c673eb6c-12c9-4eae-9bd0-3dbb1ff1a424 req-3d533b04-3d21-4664-863b-fc8b28abb0e7 service nova] [instance: c9932955-3b82-4c30-9441-b33695340ed2] Refreshing instance network info cache due to event network-changed-3a870bf7-eff5-48a4-8da7-e2d57e29a226. 
{{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 955.820145] env[65918]: DEBUG oslo_concurrency.lockutils [req-c673eb6c-12c9-4eae-9bd0-3dbb1ff1a424 req-3d533b04-3d21-4664-863b-fc8b28abb0e7 service nova] Acquiring lock "refresh_cache-c9932955-3b82-4c30-9441-b33695340ed2" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 955.820278] env[65918]: DEBUG oslo_concurrency.lockutils [req-c673eb6c-12c9-4eae-9bd0-3dbb1ff1a424 req-3d533b04-3d21-4664-863b-fc8b28abb0e7 service nova] Acquired lock "refresh_cache-c9932955-3b82-4c30-9441-b33695340ed2" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 955.820430] env[65918]: DEBUG nova.network.neutron [req-c673eb6c-12c9-4eae-9bd0-3dbb1ff1a424 req-3d533b04-3d21-4664-863b-fc8b28abb0e7 service nova] [instance: c9932955-3b82-4c30-9441-b33695340ed2] Refreshing network info cache for port 3a870bf7-eff5-48a4-8da7-e2d57e29a226 {{(pid=65918) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 956.020362] env[65918]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848212, 'name': CreateVM_Task, 'duration_secs': 0.312278} completed successfully. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 956.020517] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c9932955-3b82-4c30-9441-b33695340ed2] Created VM on the ESX host {{(pid=65918) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 956.021190] env[65918]: DEBUG oslo_concurrency.lockutils [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 956.021322] env[65918]: DEBUG oslo_concurrency.lockutils [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 956.021626] env[65918]: DEBUG oslo_concurrency.lockutils [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 956.021859] env[65918]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8d9ce876-c36e-4014-8aa5-642079e0cad1 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 956.026069] env[65918]: DEBUG oslo_vmware.api [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] Waiting for the task: (returnval){ [ 956.026069] env[65918]: value = "session[523f5ec2-7667-1e2c-bfa3-3051980fd847]526eacfe-e96d-d363-56bd-2f7669ce1b22" [ 956.026069] env[65918]: _type = "Task" [ 956.026069] env[65918]: } to complete. 
{{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 956.035412] env[65918]: DEBUG oslo_vmware.api [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] Task: {'id': session[523f5ec2-7667-1e2c-bfa3-3051980fd847]526eacfe-e96d-d363-56bd-2f7669ce1b22, 'name': SearchDatastore_Task} progress is 0%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 956.037134] env[65918]: DEBUG nova.network.neutron [req-c673eb6c-12c9-4eae-9bd0-3dbb1ff1a424 req-3d533b04-3d21-4664-863b-fc8b28abb0e7 service nova] [instance: c9932955-3b82-4c30-9441-b33695340ed2] Updated VIF entry in instance network info cache for port 3a870bf7-eff5-48a4-8da7-e2d57e29a226. {{(pid=65918) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 956.037456] env[65918]: DEBUG nova.network.neutron [req-c673eb6c-12c9-4eae-9bd0-3dbb1ff1a424 req-3d533b04-3d21-4664-863b-fc8b28abb0e7 service nova] [instance: c9932955-3b82-4c30-9441-b33695340ed2] Updating instance_info_cache with network_info: [{"id": "3a870bf7-eff5-48a4-8da7-e2d57e29a226", "address": "fa:16:3e:7a:df:8e", "network": {"id": "f6ef8be4-8990-48f7-812e-4f324a776778", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-923208939-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "959758ea621441928f05c0522a7248df", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "99be9a5e-b3f9-4e6c-83d5-df11f817847d", "external-id": "nsx-vlan-transportzone-566", "segmentation_id": 566, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3a870bf7-ef", "ovs_interfaceid": "3a870bf7-eff5-48a4-8da7-e2d57e29a226", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 956.048079] env[65918]: DEBUG oslo_concurrency.lockutils [req-c673eb6c-12c9-4eae-9bd0-3dbb1ff1a424 req-3d533b04-3d21-4664-863b-fc8b28abb0e7 service nova] Releasing lock "refresh_cache-c9932955-3b82-4c30-9441-b33695340ed2" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 956.536619] env[65918]: DEBUG oslo_concurrency.lockutils [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 956.537018] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] [instance: c9932955-3b82-4c30-9441-b33695340ed2] Processing image e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 956.537119] env[65918]: DEBUG oslo_concurrency.lockutils [None 
req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 994.424074] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 994.424433] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 994.424469] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Starting heal instance info cache {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 994.424641] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Rebuilding the list of instances to heal {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 994.440648] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 994.440867] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 994.441082] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 994.441281] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 994.441470] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: c9932955-3b82-4c30-9441-b33695340ed2] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 994.441650] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Didn't find any instances for network info cache update. 
{{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 994.442228] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 995.424137] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager.update_available_resource {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 995.433389] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 995.433608] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 995.433772] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 995.433933] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=65918) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 995.434976] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32628d48-460a-46dc-a91f-7e20930d8a28 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 995.443637] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9816b47e-36ae-4230-8cfa-7d53862bcc56 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 995.457151] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4336a0ae-13cb-49a1-b2fd-a08020ab4866 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 995.463150] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e54d8457-6380-4dd2-b1e2-d81e4cfcca5f {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 995.493175] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180973MB free_disk=168GB free_vcpus=48 pci_devices=None {{(pid=65918) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 995.493175] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Acquiring lock 
"compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 995.493175] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 995.542538] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance bd0158bd-e255-4680-b00e-81eb1ce88ad5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 995.542700] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 51163f89-c8b6-48a8-bbbe-de63c44d92a5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 995.542829] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 995.542952] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance bba6f3d9-1be3-4048-86d5-f435511b0fc0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 995.543091] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance c9932955-3b82-4c30-9441-b33695340ed2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 995.553857] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 3faaaacf-815e-4493-81a7-2a32f868442a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 995.563616] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 8017964a-7fe8-40eb-a79d-47e0401a27d1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 995.573490] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance a0f06a58-65d2-4325-8f93-0948b4e5ac8c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 995.582510] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 81fef129-8f9a-4a19-afc0-f27411c36159 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 995.582717] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Total usable vcpus: 48, total allocated vcpus: 5 {{(pid=65918) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 995.582860] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1152MB phys_disk=200GB used_disk=5GB total_vcpus=48 used_vcpus=5 pci_stats=[] {{(pid=65918) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 995.690589] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f5e709d-4994-402a-8941-8008810c2b26 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 995.697834] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1c2ee7d-4853-4a83-b4c7-f9d0813d620f {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 995.728060] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9756f309-fa7c-4f0c-be95-8bff7f1f6f6e {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 995.733973] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb5b298f-f305-4663-b05a-1ce15bf3d06c {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 995.746533] env[65918]: DEBUG nova.compute.provider_tree [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Inventory has not changed in ProviderTree for provider: 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 {{(pid=65918) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 995.754468] env[65918]: DEBUG nova.scheduler.client.report [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Inventory has not changed for provider 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 
'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65918) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 995.766686] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=65918) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 995.766859] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.274s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 996.766984] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 996.767278] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 996.767469] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 996.767636] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=65918) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 997.425421] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 998.424609] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 999.854349] env[65918]: WARNING oslo_vmware.rw_handles [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 999.854349] env[65918]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 999.854349] env[65918]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 999.854349] env[65918]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 999.854349] env[65918]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 999.854349] env[65918]: ERROR oslo_vmware.rw_handles response.begin() [ 999.854349] env[65918]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 999.854349] env[65918]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 999.854349] env[65918]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 999.854349] env[65918]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 999.854349] env[65918]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 999.854349] env[65918]: ERROR oslo_vmware.rw_handles [ 999.855022] env[65918]: DEBUG nova.virt.vmwareapi.images [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Downloaded image file data e017c336-3a02-4b58-874a-44a1d1e154fd to vmware_temp/ac2e0474-cf55-41fa-b631-d5dfe3c92248/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk on the data store datastore1 {{(pid=65918) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 999.856668] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Caching image {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 999.856916] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Copying Virtual Disk [datastore1] vmware_temp/ac2e0474-cf55-41fa-b631-d5dfe3c92248/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk to [datastore1] 
vmware_temp/ac2e0474-cf55-41fa-b631-d5dfe3c92248/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk {{(pid=65918) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 999.857201] env[65918]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-1dbcc975-774e-4eb7-b9d0-d5272a41a477 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 999.867692] env[65918]: DEBUG oslo_vmware.api [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Waiting for the task: (returnval){ [ 999.867692] env[65918]: value = "task-2848213" [ 999.867692] env[65918]: _type = "Task" [ 999.867692] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 999.875163] env[65918]: DEBUG oslo_vmware.api [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Task: {'id': task-2848213, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1000.378879] env[65918]: DEBUG oslo_vmware.exceptions [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Fault InvalidArgument not matched. {{(pid=65918) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1000.378879] env[65918]: DEBUG oslo_concurrency.lockutils [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1000.378879] env[65918]: ERROR nova.compute.manager [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1000.378879] env[65918]: Faults: ['InvalidArgument'] [ 1000.378879] env[65918]: ERROR nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Traceback (most recent call last): [ 1000.378879] env[65918]: ERROR nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1000.378879] env[65918]: ERROR nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] yield resources [ 1000.378879] env[65918]: ERROR nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1000.378879] env[65918]: ERROR nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] self.driver.spawn(context, instance, image_meta, [ 1000.379437] env[65918]: ERROR nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1000.379437] env[65918]: ERROR nova.compute.manager 
[instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1000.379437] env[65918]: ERROR nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1000.379437] env[65918]: ERROR nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] self._fetch_image_if_missing(context, vi) [ 1000.379437] env[65918]: ERROR nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1000.379437] env[65918]: ERROR nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] image_cache(vi, tmp_image_ds_loc) [ 1000.379437] env[65918]: ERROR nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1000.379437] env[65918]: ERROR nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] vm_util.copy_virtual_disk( [ 1000.379437] env[65918]: ERROR nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1000.379437] env[65918]: ERROR nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] session._wait_for_task(vmdk_copy_task) [ 1000.379437] env[65918]: ERROR nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1000.379437] env[65918]: ERROR nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] return self.wait_for_task(task_ref) [ 1000.379437] env[65918]: ERROR nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1000.379734] env[65918]: ERROR nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] return evt.wait() [ 1000.379734] env[65918]: ERROR nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1000.379734] env[65918]: ERROR nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] result = hub.switch() [ 1000.379734] env[65918]: ERROR nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1000.379734] env[65918]: ERROR nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] return self.greenlet.switch() [ 1000.379734] env[65918]: ERROR nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1000.379734] env[65918]: ERROR nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] self.f(*self.args, **self.kw) [ 1000.379734] env[65918]: ERROR nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1000.379734] env[65918]: ERROR nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] raise exceptions.translate_fault(task_info.error) [ 1000.379734] env[65918]: ERROR nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not 
correct: fileType [ 1000.379734] env[65918]: ERROR nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Faults: ['InvalidArgument'] [ 1000.379734] env[65918]: ERROR nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] [ 1000.379734] env[65918]: INFO nova.compute.manager [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Terminating instance [ 1000.380607] env[65918]: DEBUG oslo_concurrency.lockutils [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1000.380851] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1000.381056] env[65918]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e4eb17c4-e49b-4bc8-90fd-b19bfb52acdb {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1000.383273] env[65918]: DEBUG nova.compute.manager [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Start destroying the instance on the hypervisor. 
{{(pid=65918) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1000.384149] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Destroying instance {{(pid=65918) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1000.384242] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-58063ffa-e2fa-4986-9e69-84ffa4116fa1 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1000.391104] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Unregistering the VM {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1000.391330] env[65918]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-fe51361a-91ae-4a12-8ce4-d2f7d4de94ca {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1000.393568] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1000.393780] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=65918) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1000.394694] env[65918]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7d7633a2-64f8-4871-9be9-926e127cff96 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1000.399881] env[65918]: DEBUG oslo_vmware.api [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Waiting for the task: (returnval){ [ 1000.399881] env[65918]: value = "session[523f5ec2-7667-1e2c-bfa3-3051980fd847]52c2ed1a-e2f5-292f-08d5-84a1f1470083" [ 1000.399881] env[65918]: _type = "Task" [ 1000.399881] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1000.407283] env[65918]: DEBUG oslo_vmware.api [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Task: {'id': session[523f5ec2-7667-1e2c-bfa3-3051980fd847]52c2ed1a-e2f5-292f-08d5-84a1f1470083, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1000.457358] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Unregistered the VM {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1000.457577] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Deleting contents of the VM from datastore datastore1 {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1000.457754] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Deleting the datastore file [datastore1] bd0158bd-e255-4680-b00e-81eb1ce88ad5 {{(pid=65918) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1000.458031] env[65918]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-eef90586-42f0-49be-be05-2bb981526f1a {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1000.464980] env[65918]: DEBUG oslo_vmware.api [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Waiting for the task: (returnval){ [ 1000.464980] env[65918]: value = "task-2848215" [ 1000.464980] env[65918]: _type = "Task" [ 1000.464980] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1000.472565] env[65918]: DEBUG oslo_vmware.api [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Task: {'id': task-2848215, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1000.910469] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Preparing fetch location {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1000.910786] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Creating directory with path [datastore1] vmware_temp/63eb7173-c5b1-48db-83d8-9184fac5cd63/e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1000.910965] env[65918]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b26892c2-d581-405d-a1a0-a4cf8b1b9ed0 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1000.922176] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Created directory with path [datastore1] vmware_temp/63eb7173-c5b1-48db-83d8-9184fac5cd63/e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1000.922445] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Fetch image to [datastore1] vmware_temp/63eb7173-c5b1-48db-83d8-9184fac5cd63/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1000.922638] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Downloading image file data e017c336-3a02-4b58-874a-44a1d1e154fd to [datastore1] vmware_temp/63eb7173-c5b1-48db-83d8-9184fac5cd63/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk on the data store datastore1 {{(pid=65918) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1000.923358] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-460a8904-5c1e-43ec-8de2-bb7d6271d924 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1000.929811] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d32a8e7e-a65d-4c62-889a-e13915dd3514 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1000.938710] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a130a35-4512-4eb6-8a16-23b9fad91070 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1000.972567] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-6e3f6148-be26-47f2-a8d3-043e9b06a64c {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1000.980154] env[65918]: DEBUG oslo_vmware.api [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Task: {'id': task-2848215, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.068869} completed successfully. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1000.981526] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Deleted the datastore file {{(pid=65918) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1000.981712] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Deleted contents of the VM from datastore datastore1 {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1000.981878] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Instance destroyed {{(pid=65918) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1000.982064] env[65918]: INFO nova.compute.manager [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Took 0.60 seconds to destroy the instance on the hypervisor. 
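[editorial note] The traceback recorded above (spawn -> _fetch_image_if_missing -> _cache_sparse_image -> copy_virtual_disk -> wait_for_task -> _poll_task -> translate_fault) shows how the CopyVirtualDisk_Task failure surfaces as oslo_vmware.exceptions.VimFaultException ("A specified parameter was not correct: fileType", Faults: ['InvalidArgument']): the driver polls the vCenter task object and, once the task reports an error, translates the fault into a Python exception. The short sketch below illustrates only that polling/translation pattern; it is not the oslo.vmware implementation, and wait_for_task, poll_task_info and VimFaultError here are illustrative stand-ins, not real library names.

    # Simplified, assumption-laden sketch of the task-wait pattern visible in
    # the traceback above; the real code lives in oslo_vmware.api and uses a
    # looping call plus PropertyCollector reads on the Task managed object.
    import time

    class VimFaultError(Exception):
        """Stand-in for oslo_vmware.exceptions.VimFaultException."""
        def __init__(self, message, faults):
            super().__init__(message)
            self.faults = faults

    def wait_for_task(poll_task_info, interval=0.5):
        """Poll a vCenter task until success or error.

        poll_task_info is a hypothetical callable returning a dict like
        {'state': 'running'|'success'|'error', 'result': ..., 'error': {...}}.
        """
        while True:
            info = poll_task_info()
            if info["state"] == "success":
                return info.get("result")
            if info["state"] == "error":
                # Analogue of exceptions.translate_fault(task_info.error):
                # this is the point where 'InvalidArgument: fileType'
                # becomes the VimFaultException seen in the log.
                raise VimFaultError(info["error"]["message"],
                                    info["error"].get("faults", []))
            time.sleep(interval)

On such a failure, as the log shows, the compute manager aborts the resource claim, logs "Failed to build and run instance", and re-schedules the build.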
[ 1000.983764] env[65918]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-738dccfe-ec5f-4679-9caa-35144c87ed34 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1000.985566] env[65918]: DEBUG nova.compute.claims [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Aborting claim: {{(pid=65918) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1000.985737] env[65918]: DEBUG oslo_concurrency.lockutils [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1000.985943] env[65918]: DEBUG oslo_concurrency.lockutils [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1001.007531] env[65918]: DEBUG nova.virt.vmwareapi.images [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Downloading image file data e017c336-3a02-4b58-874a-44a1d1e154fd to the data store datastore1 {{(pid=65918) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1001.057955] env[65918]: DEBUG oslo_vmware.rw_handles [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/63eb7173-c5b1-48db-83d8-9184fac5cd63/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=65918) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1001.117062] env[65918]: DEBUG oslo_vmware.rw_handles [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Completed reading data from the image iterator. {{(pid=65918) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1001.117062] env[65918]: DEBUG oslo_vmware.rw_handles [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/63eb7173-c5b1-48db-83d8-9184fac5cd63/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=65918) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1001.183271] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab18e9da-69a6-4a72-9557-85c6a46ac900 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1001.190975] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c4a8c907-8432-4881-9969-383109263835 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1001.220485] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1332e1c3-6cda-4d8e-9c1f-f1c2cd6ca93d {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1001.227787] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a253976a-5d79-4082-b9b5-e02b38bbd99a {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1001.241304] env[65918]: DEBUG nova.compute.provider_tree [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Inventory has not changed in ProviderTree for provider: 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 {{(pid=65918) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1001.249326] env[65918]: DEBUG nova.scheduler.client.report [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Inventory has not changed for provider 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65918) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1001.262430] env[65918]: DEBUG oslo_concurrency.lockutils [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.276s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1001.262954] env[65918]: ERROR nova.compute.manager [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1001.262954] env[65918]: Faults: ['InvalidArgument'] [ 1001.262954] env[65918]: ERROR nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Traceback (most recent call last): [ 1001.262954] env[65918]: ERROR nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1001.262954] env[65918]: ERROR 
nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] self.driver.spawn(context, instance, image_meta, [ 1001.262954] env[65918]: ERROR nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1001.262954] env[65918]: ERROR nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1001.262954] env[65918]: ERROR nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1001.262954] env[65918]: ERROR nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] self._fetch_image_if_missing(context, vi) [ 1001.262954] env[65918]: ERROR nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1001.262954] env[65918]: ERROR nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] image_cache(vi, tmp_image_ds_loc) [ 1001.262954] env[65918]: ERROR nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1001.263306] env[65918]: ERROR nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] vm_util.copy_virtual_disk( [ 1001.263306] env[65918]: ERROR nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1001.263306] env[65918]: ERROR nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] session._wait_for_task(vmdk_copy_task) [ 1001.263306] env[65918]: ERROR nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1001.263306] env[65918]: ERROR nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] return self.wait_for_task(task_ref) [ 1001.263306] env[65918]: ERROR nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1001.263306] env[65918]: ERROR nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] return evt.wait() [ 1001.263306] env[65918]: ERROR nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1001.263306] env[65918]: ERROR nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] result = hub.switch() [ 1001.263306] env[65918]: ERROR nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1001.263306] env[65918]: ERROR nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] return self.greenlet.switch() [ 1001.263306] env[65918]: ERROR nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1001.263306] env[65918]: ERROR nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] self.f(*self.args, **self.kw) [ 1001.263658] env[65918]: ERROR nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", 
line 448, in _poll_task [ 1001.263658] env[65918]: ERROR nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] raise exceptions.translate_fault(task_info.error) [ 1001.263658] env[65918]: ERROR nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1001.263658] env[65918]: ERROR nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Faults: ['InvalidArgument'] [ 1001.263658] env[65918]: ERROR nova.compute.manager [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] [ 1001.263658] env[65918]: DEBUG nova.compute.utils [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] VimFaultException {{(pid=65918) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1001.265039] env[65918]: DEBUG nova.compute.manager [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Build of instance bd0158bd-e255-4680-b00e-81eb1ce88ad5 was re-scheduled: A specified parameter was not correct: fileType [ 1001.265039] env[65918]: Faults: ['InvalidArgument'] {{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 1001.265433] env[65918]: DEBUG nova.compute.manager [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Unplugging VIFs for instance {{(pid=65918) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1001.265614] env[65918]: DEBUG nova.compute.manager [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=65918) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1001.265906] env[65918]: DEBUG nova.compute.manager [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Deallocating network for instance {{(pid=65918) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1001.265969] env[65918]: DEBUG nova.network.neutron [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] deallocate_for_instance() {{(pid=65918) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1001.584517] env[65918]: DEBUG nova.network.neutron [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Updating instance_info_cache with network_info: [] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1001.594627] env[65918]: INFO nova.compute.manager [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Took 0.33 seconds to deallocate network for instance. [ 1001.677691] env[65918]: INFO nova.scheduler.client.report [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Deleted allocations for instance bd0158bd-e255-4680-b00e-81eb1ce88ad5 [ 1001.693071] env[65918]: DEBUG oslo_concurrency.lockutils [None req-8788c587-20c4-44ef-be07-8fcac96cb482 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Lock "bd0158bd-e255-4680-b00e-81eb1ce88ad5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 372.909s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1001.694158] env[65918]: DEBUG oslo_concurrency.lockutils [None req-3e30df49-8ce5-4b43-90a2-b678517e6b00 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Lock "bd0158bd-e255-4680-b00e-81eb1ce88ad5" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 175.335s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1001.694398] env[65918]: DEBUG oslo_concurrency.lockutils [None req-3e30df49-8ce5-4b43-90a2-b678517e6b00 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Acquiring lock "bd0158bd-e255-4680-b00e-81eb1ce88ad5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1001.694610] env[65918]: DEBUG oslo_concurrency.lockutils [None req-3e30df49-8ce5-4b43-90a2-b678517e6b00 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Lock "bd0158bd-e255-4680-b00e-81eb1ce88ad5-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1001.694770] env[65918]: DEBUG oslo_concurrency.lockutils [None req-3e30df49-8ce5-4b43-90a2-b678517e6b00 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Lock "bd0158bd-e255-4680-b00e-81eb1ce88ad5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1001.696645] env[65918]: INFO nova.compute.manager [None req-3e30df49-8ce5-4b43-90a2-b678517e6b00 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Terminating instance [ 1001.698282] env[65918]: DEBUG nova.compute.manager [None req-3e30df49-8ce5-4b43-90a2-b678517e6b00 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Start destroying the instance on the hypervisor. {{(pid=65918) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1001.698468] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-3e30df49-8ce5-4b43-90a2-b678517e6b00 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Destroying instance {{(pid=65918) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1001.698918] env[65918]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-60d11f7c-6e97-4ded-943a-f18d1f4cc407 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1001.707766] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-39e31337-87e2-4be1-9825-f14460add7ad {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1001.718611] env[65918]: DEBUG nova.compute.manager [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] [instance: 3faaaacf-815e-4493-81a7-2a32f868442a] Starting instance... {{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1001.738570] env[65918]: WARNING nova.virt.vmwareapi.vmops [None req-3e30df49-8ce5-4b43-90a2-b678517e6b00 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance bd0158bd-e255-4680-b00e-81eb1ce88ad5 could not be found. 
[ 1001.738766] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-3e30df49-8ce5-4b43-90a2-b678517e6b00 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Instance destroyed {{(pid=65918) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1001.738937] env[65918]: INFO nova.compute.manager [None req-3e30df49-8ce5-4b43-90a2-b678517e6b00 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1001.739216] env[65918]: DEBUG oslo.service.loopingcall [None req-3e30df49-8ce5-4b43-90a2-b678517e6b00 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=65918) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1001.739446] env[65918]: DEBUG nova.compute.manager [-] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Deallocating network for instance {{(pid=65918) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1001.739543] env[65918]: DEBUG nova.network.neutron [-] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] deallocate_for_instance() {{(pid=65918) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1001.761440] env[65918]: DEBUG nova.network.neutron [-] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Updating instance_info_cache with network_info: [] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1001.766184] env[65918]: DEBUG oslo_concurrency.lockutils [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1001.766385] env[65918]: DEBUG oslo_concurrency.lockutils [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1001.767734] env[65918]: INFO nova.compute.claims [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] [instance: 3faaaacf-815e-4493-81a7-2a32f868442a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1001.770587] env[65918]: INFO nova.compute.manager [-] [instance: bd0158bd-e255-4680-b00e-81eb1ce88ad5] Took 0.03 seconds to deallocate network for instance. 
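[editorial note] The entries above show the later terminate_instance pass finding the VM already gone from the backend: nova.exception.InstanceNotFound is logged as a WARNING, the instance is nonetheless reported destroyed, and network deallocation still runs inside a looping call. The sketch below illustrates that idempotent-destroy idea under stated assumptions; unregister_vm, delete_datastore_files and deallocate_network are hypothetical names for illustration, not actual nova or driver methods.

    # Minimal sketch (not nova source) of treating a missing backend VM as
    # already destroyed so that network and claim cleanup still proceed.
    import logging

    LOG = logging.getLogger(__name__)

    class InstanceNotFound(Exception):
        pass

    def destroy_instance(driver, instance_uuid):
        try:
            driver.unregister_vm(instance_uuid)          # hypothetical call
            driver.delete_datastore_files(instance_uuid) # hypothetical call
        except InstanceNotFound:
            LOG.warning("Instance does not exist on backend: %s",
                        instance_uuid)
        # Either way the instance is considered destroyed and cleanup runs.
        driver.deallocate_network(instance_uuid)         # hypothetical call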
[ 1001.852717] env[65918]: DEBUG oslo_concurrency.lockutils [None req-3e30df49-8ce5-4b43-90a2-b678517e6b00 tempest-ServerRescueNegativeTestJSON-628996508 tempest-ServerRescueNegativeTestJSON-628996508-project-member] Lock "bd0158bd-e255-4680-b00e-81eb1ce88ad5" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.159s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1001.906158] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8664765b-96a4-4d83-8fa5-ecd5672451e1 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1001.913745] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8141341d-16de-4ab3-933f-01b79762150d {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1001.944165] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-baa3eae4-74fd-402c-bc4e-343e9104b0ee {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1001.951191] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9390450b-757f-4dd6-acfe-4b68573cd2da {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1001.964135] env[65918]: DEBUG nova.compute.provider_tree [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] Inventory has not changed in ProviderTree for provider: 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 {{(pid=65918) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1001.973044] env[65918]: DEBUG nova.scheduler.client.report [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] Inventory has not changed for provider 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65918) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1001.985154] env[65918]: DEBUG oslo_concurrency.lockutils [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.219s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1001.985642] env[65918]: DEBUG nova.compute.manager [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] [instance: 3faaaacf-815e-4493-81a7-2a32f868442a] Start building networks asynchronously for instance. 
{{(pid=65918) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1002.016582] env[65918]: DEBUG nova.compute.utils [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] Using /dev/sd instead of None {{(pid=65918) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1002.016906] env[65918]: DEBUG nova.compute.manager [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] [instance: 3faaaacf-815e-4493-81a7-2a32f868442a] Allocating IP information in the background. {{(pid=65918) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1002.017091] env[65918]: DEBUG nova.network.neutron [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] [instance: 3faaaacf-815e-4493-81a7-2a32f868442a] allocate_for_instance() {{(pid=65918) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1002.025356] env[65918]: DEBUG nova.compute.manager [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] [instance: 3faaaacf-815e-4493-81a7-2a32f868442a] Start building block device mappings for instance. {{(pid=65918) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1002.071898] env[65918]: DEBUG nova.policy [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ad384f86d8e24122aea18fda15406e5a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '59b7b046facd4e8dabf95efe542224af', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=65918) authorize /opt/stack/nova/nova/policy.py:203}} [ 1002.087435] env[65918]: DEBUG nova.compute.manager [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] [instance: 3faaaacf-815e-4493-81a7-2a32f868442a] Start spawning the instance on the hypervisor. 
{{(pid=65918) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1002.110539] env[65918]: DEBUG nova.virt.hardware [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:52:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:52:33Z,direct_url=,disk_format='vmdk',id=e017c336-3a02-4b58-874a-44a1d1e154fd,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='a06639ef48fd43768e273db76d6c8f54',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:52:33Z,virtual_size=,visibility=), allow threads: False {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1002.110765] env[65918]: DEBUG nova.virt.hardware [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] Flavor limits 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1002.110999] env[65918]: DEBUG nova.virt.hardware [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] Image limits 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1002.111263] env[65918]: DEBUG nova.virt.hardware [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] Flavor pref 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1002.111423] env[65918]: DEBUG nova.virt.hardware [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] Image pref 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1002.111564] env[65918]: DEBUG nova.virt.hardware [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1002.111766] env[65918]: DEBUG nova.virt.hardware [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1002.111917] env[65918]: DEBUG nova.virt.hardware [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] Build topologies for 1 vcpu(s) 1:1:1 
{{(pid=65918) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1002.112099] env[65918]: DEBUG nova.virt.hardware [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] Got 1 possible topologies {{(pid=65918) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1002.112252] env[65918]: DEBUG nova.virt.hardware [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1002.112414] env[65918]: DEBUG nova.virt.hardware [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1002.113366] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b619316b-f64b-4c59-b3f5-06cde59eff73 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1002.121573] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e67ed0e5-9b19-41b1-bb8d-5469c6d782a1 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1002.382036] env[65918]: DEBUG nova.network.neutron [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] [instance: 3faaaacf-815e-4493-81a7-2a32f868442a] Successfully created port: 7486ddef-cbd9-4389-b4ab-91605f65fdc1 {{(pid=65918) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1003.031418] env[65918]: DEBUG nova.network.neutron [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] [instance: 3faaaacf-815e-4493-81a7-2a32f868442a] Successfully updated port: 7486ddef-cbd9-4389-b4ab-91605f65fdc1 {{(pid=65918) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1003.039827] env[65918]: DEBUG oslo_concurrency.lockutils [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] Acquiring lock "refresh_cache-3faaaacf-815e-4493-81a7-2a32f868442a" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1003.039894] env[65918]: DEBUG oslo_concurrency.lockutils [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] Acquired lock "refresh_cache-3faaaacf-815e-4493-81a7-2a32f868442a" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1003.040218] env[65918]: DEBUG nova.network.neutron [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] [instance: 3faaaacf-815e-4493-81a7-2a32f868442a] Building network 
info cache for instance {{(pid=65918) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1003.077542] env[65918]: DEBUG nova.network.neutron [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] [instance: 3faaaacf-815e-4493-81a7-2a32f868442a] Instance cache missing network info. {{(pid=65918) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1003.233265] env[65918]: DEBUG nova.network.neutron [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] [instance: 3faaaacf-815e-4493-81a7-2a32f868442a] Updating instance_info_cache with network_info: [{"id": "7486ddef-cbd9-4389-b4ab-91605f65fdc1", "address": "fa:16:3e:d9:2b:96", "network": {"id": "1723af0b-5a91-4aca-9bf3-cf44974c0cb0", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-1722787191-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "59b7b046facd4e8dabf95efe542224af", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e31a7f15-a808-4199-9071-31fd05e316ea", "external-id": "nsx-vlan-transportzone-388", "segmentation_id": 388, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7486ddef-cb", "ovs_interfaceid": "7486ddef-cbd9-4389-b4ab-91605f65fdc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1003.243648] env[65918]: DEBUG oslo_concurrency.lockutils [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] Releasing lock "refresh_cache-3faaaacf-815e-4493-81a7-2a32f868442a" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1003.243983] env[65918]: DEBUG nova.compute.manager [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] [instance: 3faaaacf-815e-4493-81a7-2a32f868442a] Instance network_info: |[{"id": "7486ddef-cbd9-4389-b4ab-91605f65fdc1", "address": "fa:16:3e:d9:2b:96", "network": {"id": "1723af0b-5a91-4aca-9bf3-cf44974c0cb0", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-1722787191-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "59b7b046facd4e8dabf95efe542224af", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, 
"nsx-logical-switch-id": "e31a7f15-a808-4199-9071-31fd05e316ea", "external-id": "nsx-vlan-transportzone-388", "segmentation_id": 388, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7486ddef-cb", "ovs_interfaceid": "7486ddef-cbd9-4389-b4ab-91605f65fdc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=65918) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1003.244380] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] [instance: 3faaaacf-815e-4493-81a7-2a32f868442a] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:d9:2b:96', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'e31a7f15-a808-4199-9071-31fd05e316ea', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '7486ddef-cbd9-4389-b4ab-91605f65fdc1', 'vif_model': 'vmxnet3'}] {{(pid=65918) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1003.252234] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] Creating folder: Project (59b7b046facd4e8dabf95efe542224af). Parent ref: group-v572679. {{(pid=65918) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1003.252749] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-932cac2d-5992-44b0-9c39-da427d616bb0 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1003.264473] env[65918]: INFO nova.virt.vmwareapi.vm_util [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] Created folder: Project (59b7b046facd4e8dabf95efe542224af) in parent group-v572679. [ 1003.264660] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] Creating folder: Instances. Parent ref: group-v572733. {{(pid=65918) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1003.264876] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-aca76583-c251-4dd1-a7e8-a3581bb5d902 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1003.273403] env[65918]: INFO nova.virt.vmwareapi.vm_util [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] Created folder: Instances in parent group-v572733. [ 1003.273626] env[65918]: DEBUG oslo.service.loopingcall [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=65918) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1003.273806] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3faaaacf-815e-4493-81a7-2a32f868442a] Creating VM on the ESX host {{(pid=65918) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1003.273996] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-6ee12b98-bed4-48f3-a90c-8efdf4ed0d14 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1003.293223] env[65918]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1003.293223] env[65918]: value = "task-2848218" [ 1003.293223] env[65918]: _type = "Task" [ 1003.293223] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1003.300545] env[65918]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848218, 'name': CreateVM_Task} progress is 0%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1003.804564] env[65918]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848218, 'name': CreateVM_Task, 'duration_secs': 0.308271} completed successfully. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1003.804767] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3faaaacf-815e-4493-81a7-2a32f868442a] Created VM on the ESX host {{(pid=65918) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1003.805407] env[65918]: DEBUG oslo_concurrency.lockutils [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1003.805569] env[65918]: DEBUG oslo_concurrency.lockutils [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1003.805866] env[65918]: DEBUG oslo_concurrency.lockutils [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1003.806120] env[65918]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-816c319c-8496-4edf-bd18-b8df4c0a1a6a {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1003.810553] env[65918]: DEBUG oslo_vmware.api [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] Waiting for the task: (returnval){ [ 1003.810553] env[65918]: value = "session[523f5ec2-7667-1e2c-bfa3-3051980fd847]521ada49-4c89-f998-ec2d-1914089afedf" [ 1003.810553] env[65918]: _type = "Task" [ 1003.810553] env[65918]: } to complete. 
{{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1003.818343] env[65918]: DEBUG oslo_vmware.api [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] Task: {'id': session[523f5ec2-7667-1e2c-bfa3-3051980fd847]521ada49-4c89-f998-ec2d-1914089afedf, 'name': SearchDatastore_Task} progress is 0%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1003.842315] env[65918]: DEBUG nova.compute.manager [req-bb38dd32-fd5f-4961-b0a7-a61a0e2fd719 req-689b2491-d2e4-4ebe-861a-9db2acef15f9 service nova] [instance: 3faaaacf-815e-4493-81a7-2a32f868442a] Received event network-vif-plugged-7486ddef-cbd9-4389-b4ab-91605f65fdc1 {{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1003.842610] env[65918]: DEBUG oslo_concurrency.lockutils [req-bb38dd32-fd5f-4961-b0a7-a61a0e2fd719 req-689b2491-d2e4-4ebe-861a-9db2acef15f9 service nova] Acquiring lock "3faaaacf-815e-4493-81a7-2a32f868442a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1003.842805] env[65918]: DEBUG oslo_concurrency.lockutils [req-bb38dd32-fd5f-4961-b0a7-a61a0e2fd719 req-689b2491-d2e4-4ebe-861a-9db2acef15f9 service nova] Lock "3faaaacf-815e-4493-81a7-2a32f868442a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1003.842855] env[65918]: DEBUG oslo_concurrency.lockutils [req-bb38dd32-fd5f-4961-b0a7-a61a0e2fd719 req-689b2491-d2e4-4ebe-861a-9db2acef15f9 service nova] Lock "3faaaacf-815e-4493-81a7-2a32f868442a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1003.843031] env[65918]: DEBUG nova.compute.manager [req-bb38dd32-fd5f-4961-b0a7-a61a0e2fd719 req-689b2491-d2e4-4ebe-861a-9db2acef15f9 service nova] [instance: 3faaaacf-815e-4493-81a7-2a32f868442a] No waiting events found dispatching network-vif-plugged-7486ddef-cbd9-4389-b4ab-91605f65fdc1 {{(pid=65918) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1003.843175] env[65918]: WARNING nova.compute.manager [req-bb38dd32-fd5f-4961-b0a7-a61a0e2fd719 req-689b2491-d2e4-4ebe-861a-9db2acef15f9 service nova] [instance: 3faaaacf-815e-4493-81a7-2a32f868442a] Received unexpected event network-vif-plugged-7486ddef-cbd9-4389-b4ab-91605f65fdc1 for instance with vm_state building and task_state spawning. [ 1003.843328] env[65918]: DEBUG nova.compute.manager [req-bb38dd32-fd5f-4961-b0a7-a61a0e2fd719 req-689b2491-d2e4-4ebe-861a-9db2acef15f9 service nova] [instance: 3faaaacf-815e-4493-81a7-2a32f868442a] Received event network-changed-7486ddef-cbd9-4389-b4ab-91605f65fdc1 {{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1003.843484] env[65918]: DEBUG nova.compute.manager [req-bb38dd32-fd5f-4961-b0a7-a61a0e2fd719 req-689b2491-d2e4-4ebe-861a-9db2acef15f9 service nova] [instance: 3faaaacf-815e-4493-81a7-2a32f868442a] Refreshing instance network info cache due to event network-changed-7486ddef-cbd9-4389-b4ab-91605f65fdc1. 
{{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 1003.843681] env[65918]: DEBUG oslo_concurrency.lockutils [req-bb38dd32-fd5f-4961-b0a7-a61a0e2fd719 req-689b2491-d2e4-4ebe-861a-9db2acef15f9 service nova] Acquiring lock "refresh_cache-3faaaacf-815e-4493-81a7-2a32f868442a" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1003.843812] env[65918]: DEBUG oslo_concurrency.lockutils [req-bb38dd32-fd5f-4961-b0a7-a61a0e2fd719 req-689b2491-d2e4-4ebe-861a-9db2acef15f9 service nova] Acquired lock "refresh_cache-3faaaacf-815e-4493-81a7-2a32f868442a" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1003.843962] env[65918]: DEBUG nova.network.neutron [req-bb38dd32-fd5f-4961-b0a7-a61a0e2fd719 req-689b2491-d2e4-4ebe-861a-9db2acef15f9 service nova] [instance: 3faaaacf-815e-4493-81a7-2a32f868442a] Refreshing network info cache for port 7486ddef-cbd9-4389-b4ab-91605f65fdc1 {{(pid=65918) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1004.245988] env[65918]: DEBUG nova.network.neutron [req-bb38dd32-fd5f-4961-b0a7-a61a0e2fd719 req-689b2491-d2e4-4ebe-861a-9db2acef15f9 service nova] [instance: 3faaaacf-815e-4493-81a7-2a32f868442a] Updated VIF entry in instance network info cache for port 7486ddef-cbd9-4389-b4ab-91605f65fdc1. {{(pid=65918) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1004.246418] env[65918]: DEBUG nova.network.neutron [req-bb38dd32-fd5f-4961-b0a7-a61a0e2fd719 req-689b2491-d2e4-4ebe-861a-9db2acef15f9 service nova] [instance: 3faaaacf-815e-4493-81a7-2a32f868442a] Updating instance_info_cache with network_info: [{"id": "7486ddef-cbd9-4389-b4ab-91605f65fdc1", "address": "fa:16:3e:d9:2b:96", "network": {"id": "1723af0b-5a91-4aca-9bf3-cf44974c0cb0", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-1722787191-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "59b7b046facd4e8dabf95efe542224af", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e31a7f15-a808-4199-9071-31fd05e316ea", "external-id": "nsx-vlan-transportzone-388", "segmentation_id": 388, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7486ddef-cb", "ovs_interfaceid": "7486ddef-cbd9-4389-b4ab-91605f65fdc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1004.256057] env[65918]: DEBUG oslo_concurrency.lockutils [req-bb38dd32-fd5f-4961-b0a7-a61a0e2fd719 req-689b2491-d2e4-4ebe-861a-9db2acef15f9 service nova] Releasing lock "refresh_cache-3faaaacf-815e-4493-81a7-2a32f868442a" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1004.320958] env[65918]: DEBUG oslo_concurrency.lockutils [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] Releasing 
lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1004.320958] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] [instance: 3faaaacf-815e-4493-81a7-2a32f868442a] Processing image e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1004.321148] env[65918]: DEBUG oslo_concurrency.lockutils [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1018.772346] env[65918]: DEBUG nova.compute.manager [req-da5a5bca-bddd-49a8-96e4-5aa009b98627 req-a509bfda-2d5a-4076-9542-07e7fc07b89a service nova] [instance: c9932955-3b82-4c30-9441-b33695340ed2] Received event network-vif-deleted-3a870bf7-eff5-48a4-8da7-e2d57e29a226 {{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1046.888835] env[65918]: WARNING oslo_vmware.rw_handles [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1046.888835] env[65918]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1046.888835] env[65918]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1046.888835] env[65918]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1046.888835] env[65918]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1046.888835] env[65918]: ERROR oslo_vmware.rw_handles response.begin() [ 1046.888835] env[65918]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1046.888835] env[65918]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1046.888835] env[65918]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1046.888835] env[65918]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1046.888835] env[65918]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1046.888835] env[65918]: ERROR oslo_vmware.rw_handles [ 1046.889556] env[65918]: DEBUG nova.virt.vmwareapi.images [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Downloaded image file data e017c336-3a02-4b58-874a-44a1d1e154fd to vmware_temp/63eb7173-c5b1-48db-83d8-9184fac5cd63/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk on the data store datastore1 {{(pid=65918) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1046.891157] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None 
req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Caching image {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1046.891410] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Copying Virtual Disk [datastore1] vmware_temp/63eb7173-c5b1-48db-83d8-9184fac5cd63/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk to [datastore1] vmware_temp/63eb7173-c5b1-48db-83d8-9184fac5cd63/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk {{(pid=65918) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1046.891705] env[65918]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-f3456f28-1c0c-4c9a-9728-57c13222b0f5 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1046.900619] env[65918]: DEBUG oslo_vmware.api [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Waiting for the task: (returnval){ [ 1046.900619] env[65918]: value = "task-2848219" [ 1046.900619] env[65918]: _type = "Task" [ 1046.900619] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1046.908523] env[65918]: DEBUG oslo_vmware.api [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Task: {'id': task-2848219, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1047.411200] env[65918]: DEBUG oslo_vmware.exceptions [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Fault InvalidArgument not matched. 
{{(pid=65918) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1047.411432] env[65918]: DEBUG oslo_concurrency.lockutils [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1047.411974] env[65918]: ERROR nova.compute.manager [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1047.411974] env[65918]: Faults: ['InvalidArgument'] [ 1047.411974] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Traceback (most recent call last): [ 1047.411974] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1047.411974] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] yield resources [ 1047.411974] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1047.411974] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] self.driver.spawn(context, instance, image_meta, [ 1047.411974] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1047.411974] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1047.411974] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1047.411974] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] self._fetch_image_if_missing(context, vi) [ 1047.411974] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1047.412434] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] image_cache(vi, tmp_image_ds_loc) [ 1047.412434] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1047.412434] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] vm_util.copy_virtual_disk( [ 1047.412434] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1047.412434] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] session._wait_for_task(vmdk_copy_task) [ 1047.412434] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1047.412434] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] return self.wait_for_task(task_ref) [ 1047.412434] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1047.412434] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] return evt.wait() [ 1047.412434] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1047.412434] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] result = hub.switch() [ 1047.412434] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1047.412434] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] return self.greenlet.switch() [ 1047.412827] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1047.412827] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] self.f(*self.args, **self.kw) [ 1047.412827] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1047.412827] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] raise exceptions.translate_fault(task_info.error) [ 1047.412827] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1047.412827] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Faults: ['InvalidArgument'] [ 1047.412827] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] [ 1047.412827] env[65918]: INFO nova.compute.manager [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Terminating instance [ 1047.413878] env[65918]: DEBUG oslo_concurrency.lockutils [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1047.414102] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1047.414325] env[65918]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7f2bf7d4-b6a4-48c4-87bf-25c92a32414a {{(pid=65918) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1047.416381] env[65918]: DEBUG nova.compute.manager [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Start destroying the instance on the hypervisor. {{(pid=65918) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1047.416571] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Destroying instance {{(pid=65918) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1047.417275] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-97989ecf-4027-478c-8bb8-821f93140412 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1047.424128] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Unregistering the VM {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1047.424384] env[65918]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-66b72873-a9c0-433d-8dab-baa25c18116b {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1047.426384] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1047.426561] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=65918) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1047.427455] env[65918]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5e55db90-5ea6-444d-9658-b62ecc45f74f {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1047.431957] env[65918]: DEBUG oslo_vmware.api [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Waiting for the task: (returnval){ [ 1047.431957] env[65918]: value = "session[523f5ec2-7667-1e2c-bfa3-3051980fd847]525546f6-1745-0bdb-1d24-7f4dd43ba3b3" [ 1047.431957] env[65918]: _type = "Task" [ 1047.431957] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1047.442517] env[65918]: DEBUG oslo_vmware.api [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Task: {'id': session[523f5ec2-7667-1e2c-bfa3-3051980fd847]525546f6-1745-0bdb-1d24-7f4dd43ba3b3, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1047.488301] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Unregistered the VM {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1047.488535] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Deleting contents of the VM from datastore datastore1 {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1047.488784] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Deleting the datastore file [datastore1] 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c {{(pid=65918) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1047.489063] env[65918]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-0e6111d1-b02f-4c11-935e-9fcc739b715a {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1047.495718] env[65918]: DEBUG oslo_vmware.api [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Waiting for the task: (returnval){ [ 1047.495718] env[65918]: value = "task-2848221" [ 1047.495718] env[65918]: _type = "Task" [ 1047.495718] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1047.504453] env[65918]: DEBUG oslo_vmware.api [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Task: {'id': task-2848221, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1047.942476] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Preparing fetch location {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1047.942893] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Creating directory with path [datastore1] vmware_temp/cee9d159-3f75-4c70-b48d-d1b223c84809/e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1047.942991] env[65918]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b28cb423-42b4-4aa3-af86-b6df8dcce221 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1047.954564] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Created directory with path [datastore1] vmware_temp/cee9d159-3f75-4c70-b48d-d1b223c84809/e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1047.954753] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Fetch image to [datastore1] vmware_temp/cee9d159-3f75-4c70-b48d-d1b223c84809/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1047.954922] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Downloading image file data e017c336-3a02-4b58-874a-44a1d1e154fd to [datastore1] vmware_temp/cee9d159-3f75-4c70-b48d-d1b223c84809/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk on the data store datastore1 {{(pid=65918) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1047.955679] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d96ddd1c-8afd-406f-8f66-578a8ee4af69 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1047.962219] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-68ae4ddd-b86f-4089-92b5-975d616006b0 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1047.971037] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e8132c2e-82ee-4d0e-ac9e-a496fd650470 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1048.003794] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9afdf208-1ad9-483b-9ccc-3b051b0c79d1 {{(pid=65918) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1048.010811] env[65918]: DEBUG oslo_vmware.api [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Task: {'id': task-2848221, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.080623} completed successfully. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1048.012316] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Deleted the datastore file {{(pid=65918) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1048.012510] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Deleted contents of the VM from datastore datastore1 {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1048.012685] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Instance destroyed {{(pid=65918) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1048.012857] env[65918]: INFO nova.compute.manager [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Took 0.60 seconds to destroy the instance on the hypervisor. 
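The records above show the driver issuing vCenter tasks (CreateVM_Task, CopyVirtualDisk_Task, DeleteDatastoreFile_Task) and polling each one until it either completes ("completed successfully ... duration_secs") or surfaces a translated fault, as in the InvalidArgument/fileType traceback raised from _poll_task. A minimal standard-library sketch of that poll-until-done pattern follows; the field names on the returned object (state, progress, error) are assumptions that only loosely mirror the vSphere TaskInfo shape, and the real loop in oslo_vmware.api differs in detail.

    import time

    class TaskFault(Exception):
        """Stands in for a translated task fault such as VimFaultException."""

    def wait_for_task(get_task_info, interval=0.5):
        # get_task_info() is assumed to return an object with .state in
        # {"queued", "running", "success", "error"}, .progress (int) and
        # .error (message or None) -- illustrative names, not the real API.
        while True:
            info = get_task_info()
            if info.state == "success":
                return info                  # e.g. CreateVM_Task finished
            if info.state == "error":
                # the library translates the server fault before raising;
                # here we simply wrap the message
                raise TaskFault(info.error)  # e.g. "InvalidArgument: fileType"
            # queued/running: report progress and poll again, like
            # "Task: {'id': task-2848219, ...} progress is 0%."
            print(f"progress is {info.progress}%")
            time.sleep(interval)

The key property visible in the log is that the caller blocks on the task and only learns about the fileType fault when polling reports the error state, at which point spawn aborts and the cleanup records that follow are emitted.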
[ 1048.014604] env[65918]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-124eb369-c38d-4ce6-abbe-3ad4baebff9a {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1048.016534] env[65918]: DEBUG nova.compute.claims [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Aborting claim: {{(pid=65918) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1048.016716] env[65918]: DEBUG oslo_concurrency.lockutils [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1048.016928] env[65918]: DEBUG oslo_concurrency.lockutils [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1048.039190] env[65918]: DEBUG nova.virt.vmwareapi.images [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Downloading image file data e017c336-3a02-4b58-874a-44a1d1e154fd to the data store datastore1 {{(pid=65918) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1048.085382] env[65918]: DEBUG oslo_vmware.rw_handles [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/cee9d159-3f75-4c70-b48d-d1b223c84809/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=65918) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1048.143970] env[65918]: DEBUG oslo_vmware.rw_handles [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Completed reading data from the image iterator. {{(pid=65918) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1048.144180] env[65918]: DEBUG oslo_vmware.rw_handles [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/cee9d159-3f75-4c70-b48d-d1b223c84809/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=65918) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1048.197227] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-59c501cf-0512-4cec-859c-811798b12e2e {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1048.205094] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dabf6e24-5050-456b-8016-1ffa9a4e4b0e {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1048.237492] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-88041df7-72f8-4525-ab66-2f8d9f28a18f {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1048.244187] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4cd2779c-b02d-4042-9f1c-948fdb81b197 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1048.256905] env[65918]: DEBUG nova.compute.provider_tree [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Inventory has not changed in ProviderTree for provider: 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 {{(pid=65918) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1048.265558] env[65918]: DEBUG nova.scheduler.client.report [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Inventory has not changed for provider 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65918) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1048.278309] env[65918]: DEBUG oslo_concurrency.lockutils [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.261s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1048.278936] env[65918]: ERROR nova.compute.manager [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1048.278936] env[65918]: Faults: ['InvalidArgument'] [ 1048.278936] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Traceback (most recent call last): [ 1048.278936] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1048.278936] env[65918]: ERROR 
nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] self.driver.spawn(context, instance, image_meta, [ 1048.278936] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1048.278936] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1048.278936] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1048.278936] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] self._fetch_image_if_missing(context, vi) [ 1048.278936] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1048.278936] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] image_cache(vi, tmp_image_ds_loc) [ 1048.278936] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1048.279371] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] vm_util.copy_virtual_disk( [ 1048.279371] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1048.279371] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] session._wait_for_task(vmdk_copy_task) [ 1048.279371] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1048.279371] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] return self.wait_for_task(task_ref) [ 1048.279371] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1048.279371] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] return evt.wait() [ 1048.279371] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1048.279371] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] result = hub.switch() [ 1048.279371] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1048.279371] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] return self.greenlet.switch() [ 1048.279371] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1048.279371] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] self.f(*self.args, **self.kw) [ 1048.279690] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", 
line 448, in _poll_task [ 1048.279690] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] raise exceptions.translate_fault(task_info.error) [ 1048.279690] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1048.279690] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Faults: ['InvalidArgument'] [ 1048.279690] env[65918]: ERROR nova.compute.manager [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] [ 1048.279690] env[65918]: DEBUG nova.compute.utils [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] VimFaultException {{(pid=65918) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1048.280989] env[65918]: DEBUG nova.compute.manager [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Build of instance 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c was re-scheduled: A specified parameter was not correct: fileType [ 1048.280989] env[65918]: Faults: ['InvalidArgument'] {{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 1048.281381] env[65918]: DEBUG nova.compute.manager [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Unplugging VIFs for instance {{(pid=65918) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1048.281550] env[65918]: DEBUG nova.compute.manager [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=65918) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1048.281702] env[65918]: DEBUG nova.compute.manager [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Deallocating network for instance {{(pid=65918) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1048.281861] env[65918]: DEBUG nova.network.neutron [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] deallocate_for_instance() {{(pid=65918) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1048.544784] env[65918]: DEBUG nova.network.neutron [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Updating instance_info_cache with network_info: [] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1048.556797] env[65918]: INFO nova.compute.manager [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Took 0.27 seconds to deallocate network for instance. [ 1048.644078] env[65918]: INFO nova.scheduler.client.report [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Deleted allocations for instance 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c [ 1048.659478] env[65918]: DEBUG oslo_concurrency.lockutils [None req-2f6a9d56-0872-4146-a3f4-e5dc7f1b2900 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Lock "32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 417.440s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1048.660575] env[65918]: DEBUG oslo_concurrency.lockutils [None req-b5b4e98e-5beb-4767-b693-6ae11076c6f2 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Lock "32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 218.679s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1048.660785] env[65918]: DEBUG oslo_concurrency.lockutils [None req-b5b4e98e-5beb-4767-b693-6ae11076c6f2 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Acquiring lock "32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1048.660989] env[65918]: DEBUG oslo_concurrency.lockutils [None req-b5b4e98e-5beb-4767-b693-6ae11076c6f2 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Lock "32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1048.661170] env[65918]: DEBUG oslo_concurrency.lockutils [None req-b5b4e98e-5beb-4767-b693-6ae11076c6f2 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Lock "32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1048.663025] env[65918]: INFO nova.compute.manager [None req-b5b4e98e-5beb-4767-b693-6ae11076c6f2 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Terminating instance [ 1048.667468] env[65918]: DEBUG nova.compute.manager [None req-b5b4e98e-5beb-4767-b693-6ae11076c6f2 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Start destroying the instance on the hypervisor. {{(pid=65918) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1048.667468] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-b5b4e98e-5beb-4767-b693-6ae11076c6f2 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Destroying instance {{(pid=65918) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1048.667703] env[65918]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-eb0d1f0c-d939-4e98-849c-95bbd3745548 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1048.676758] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48aefa56-d5a8-422d-bb35-cfaa791bd39b {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1048.687287] env[65918]: DEBUG nova.compute.manager [None req-33b08c2a-a672-4ec7-906f-f008b89ba893 tempest-AttachVolumeShelveTestJSON-1762481057 tempest-AttachVolumeShelveTestJSON-1762481057-project-member] [instance: 8017964a-7fe8-40eb-a79d-47e0401a27d1] Starting instance... {{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1048.708259] env[65918]: WARNING nova.virt.vmwareapi.vmops [None req-b5b4e98e-5beb-4767-b693-6ae11076c6f2 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c could not be found. 
[ 1048.709037] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-b5b4e98e-5beb-4767-b693-6ae11076c6f2 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Instance destroyed {{(pid=65918) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1048.709037] env[65918]: INFO nova.compute.manager [None req-b5b4e98e-5beb-4767-b693-6ae11076c6f2 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1048.709169] env[65918]: DEBUG oslo.service.loopingcall [None req-b5b4e98e-5beb-4767-b693-6ae11076c6f2 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=65918) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1048.709262] env[65918]: DEBUG nova.compute.manager [-] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Deallocating network for instance {{(pid=65918) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1048.709357] env[65918]: DEBUG nova.network.neutron [-] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] deallocate_for_instance() {{(pid=65918) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1048.733530] env[65918]: DEBUG nova.network.neutron [-] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Updating instance_info_cache with network_info: [] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1048.737759] env[65918]: DEBUG oslo_concurrency.lockutils [None req-33b08c2a-a672-4ec7-906f-f008b89ba893 tempest-AttachVolumeShelveTestJSON-1762481057 tempest-AttachVolumeShelveTestJSON-1762481057-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1048.737946] env[65918]: DEBUG oslo_concurrency.lockutils [None req-33b08c2a-a672-4ec7-906f-f008b89ba893 tempest-AttachVolumeShelveTestJSON-1762481057 tempest-AttachVolumeShelveTestJSON-1762481057-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1048.739729] env[65918]: INFO nova.compute.claims [None req-33b08c2a-a672-4ec7-906f-f008b89ba893 tempest-AttachVolumeShelveTestJSON-1762481057 tempest-AttachVolumeShelveTestJSON-1762481057-project-member] [instance: 8017964a-7fe8-40eb-a79d-47e0401a27d1] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1048.742974] env[65918]: INFO nova.compute.manager [-] [instance: 32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c] Took 0.03 seconds to deallocate network for instance. 
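The "Claim successful" record, together with the inventory payloads reported for provider 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 ('VCPU': total 48, allocation_ratio 4.0; 'MEMORY_MB': total 196590, reserved 512; 'DISK_GB': total 400), follows the usual placement capacity rule: schedulable capacity per resource class is (total - reserved) * allocation_ratio. A small worked check against the numbers in this log; treat it as an illustration of the arithmetic, not Nova code.

    # Inventory as reported in the log for this resource provider.
    inventory = {
        "VCPU":      {"total": 48,     "reserved": 0,   "allocation_ratio": 4.0},
        "MEMORY_MB": {"total": 196590, "reserved": 512, "allocation_ratio": 1.0},
        "DISK_GB":   {"total": 400,    "reserved": 0,   "allocation_ratio": 1.0},
    }

    def schedulable_capacity(inv):
        # Placement-style capacity: (total - reserved) * allocation_ratio.
        return {rc: (v["total"] - v["reserved"]) * v["allocation_ratio"]
                for rc, v in inv.items()}

    print(schedulable_capacity(inventory))
    # -> {'VCPU': 192.0, 'MEMORY_MB': 196078.0, 'DISK_GB': 400.0}

So even though the host exposes 48 physical vCPUs, the 4.0 allocation ratio lets claims like the one for instance 8017964a-7fe8-40eb-a79d-47e0401a27d1 succeed until 192 vCPUs worth of instances are placed.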
[ 1048.823627] env[65918]: DEBUG oslo_concurrency.lockutils [None req-b5b4e98e-5beb-4767-b693-6ae11076c6f2 tempest-ServersAdminNegativeTestJSON-315742385 tempest-ServersAdminNegativeTestJSON-315742385-project-member] Lock "32f1575d-fb8a-4b7d-a7aa-8b1142d6b75c" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.163s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1048.865020] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-abf8749e-9c08-4fa9-91d0-0c8f2d28d55e {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1048.872852] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f39f00bb-ac86-49c2-aad1-dd126562c497 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1048.902464] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f6984ab-e540-4dbf-97e6-04ef37d9a083 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1048.909150] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5ff647ff-c044-4baf-9772-2f9e7f078226 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1048.921836] env[65918]: DEBUG nova.compute.provider_tree [None req-33b08c2a-a672-4ec7-906f-f008b89ba893 tempest-AttachVolumeShelveTestJSON-1762481057 tempest-AttachVolumeShelveTestJSON-1762481057-project-member] Inventory has not changed in ProviderTree for provider: 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 {{(pid=65918) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1048.930349] env[65918]: DEBUG nova.scheduler.client.report [None req-33b08c2a-a672-4ec7-906f-f008b89ba893 tempest-AttachVolumeShelveTestJSON-1762481057 tempest-AttachVolumeShelveTestJSON-1762481057-project-member] Inventory has not changed for provider 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65918) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1048.943422] env[65918]: DEBUG oslo_concurrency.lockutils [None req-33b08c2a-a672-4ec7-906f-f008b89ba893 tempest-AttachVolumeShelveTestJSON-1762481057 tempest-AttachVolumeShelveTestJSON-1762481057-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.205s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1048.943882] env[65918]: DEBUG nova.compute.manager [None req-33b08c2a-a672-4ec7-906f-f008b89ba893 tempest-AttachVolumeShelveTestJSON-1762481057 tempest-AttachVolumeShelveTestJSON-1762481057-project-member] [instance: 8017964a-7fe8-40eb-a79d-47e0401a27d1] Start building networks asynchronously for instance. 
{{(pid=65918) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1048.978325] env[65918]: DEBUG nova.compute.utils [None req-33b08c2a-a672-4ec7-906f-f008b89ba893 tempest-AttachVolumeShelveTestJSON-1762481057 tempest-AttachVolumeShelveTestJSON-1762481057-project-member] Using /dev/sd instead of None {{(pid=65918) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1048.980282] env[65918]: DEBUG nova.compute.manager [None req-33b08c2a-a672-4ec7-906f-f008b89ba893 tempest-AttachVolumeShelveTestJSON-1762481057 tempest-AttachVolumeShelveTestJSON-1762481057-project-member] [instance: 8017964a-7fe8-40eb-a79d-47e0401a27d1] Allocating IP information in the background. {{(pid=65918) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1048.980458] env[65918]: DEBUG nova.network.neutron [None req-33b08c2a-a672-4ec7-906f-f008b89ba893 tempest-AttachVolumeShelveTestJSON-1762481057 tempest-AttachVolumeShelveTestJSON-1762481057-project-member] [instance: 8017964a-7fe8-40eb-a79d-47e0401a27d1] allocate_for_instance() {{(pid=65918) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1048.988069] env[65918]: DEBUG nova.compute.manager [None req-33b08c2a-a672-4ec7-906f-f008b89ba893 tempest-AttachVolumeShelveTestJSON-1762481057 tempest-AttachVolumeShelveTestJSON-1762481057-project-member] [instance: 8017964a-7fe8-40eb-a79d-47e0401a27d1] Start building block device mappings for instance. {{(pid=65918) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1049.033493] env[65918]: DEBUG nova.policy [None req-33b08c2a-a672-4ec7-906f-f008b89ba893 tempest-AttachVolumeShelveTestJSON-1762481057 tempest-AttachVolumeShelveTestJSON-1762481057-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '54fcd3cd71af4d71aa2430fec8eada1c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '35faa253c2a243e2851f147c1e0ee4d5', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=65918) authorize /opt/stack/nova/nova/policy.py:203}} [ 1049.047228] env[65918]: DEBUG nova.compute.manager [None req-33b08c2a-a672-4ec7-906f-f008b89ba893 tempest-AttachVolumeShelveTestJSON-1762481057 tempest-AttachVolumeShelveTestJSON-1762481057-project-member] [instance: 8017964a-7fe8-40eb-a79d-47e0401a27d1] Start spawning the instance on the hypervisor. 
{{(pid=65918) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1049.068277] env[65918]: DEBUG nova.virt.hardware [None req-33b08c2a-a672-4ec7-906f-f008b89ba893 tempest-AttachVolumeShelveTestJSON-1762481057 tempest-AttachVolumeShelveTestJSON-1762481057-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:52:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:52:33Z,direct_url=,disk_format='vmdk',id=e017c336-3a02-4b58-874a-44a1d1e154fd,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='a06639ef48fd43768e273db76d6c8f54',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:52:33Z,virtual_size=,visibility=), allow threads: False {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1049.068528] env[65918]: DEBUG nova.virt.hardware [None req-33b08c2a-a672-4ec7-906f-f008b89ba893 tempest-AttachVolumeShelveTestJSON-1762481057 tempest-AttachVolumeShelveTestJSON-1762481057-project-member] Flavor limits 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1049.068714] env[65918]: DEBUG nova.virt.hardware [None req-33b08c2a-a672-4ec7-906f-f008b89ba893 tempest-AttachVolumeShelveTestJSON-1762481057 tempest-AttachVolumeShelveTestJSON-1762481057-project-member] Image limits 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1049.068905] env[65918]: DEBUG nova.virt.hardware [None req-33b08c2a-a672-4ec7-906f-f008b89ba893 tempest-AttachVolumeShelveTestJSON-1762481057 tempest-AttachVolumeShelveTestJSON-1762481057-project-member] Flavor pref 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1049.069062] env[65918]: DEBUG nova.virt.hardware [None req-33b08c2a-a672-4ec7-906f-f008b89ba893 tempest-AttachVolumeShelveTestJSON-1762481057 tempest-AttachVolumeShelveTestJSON-1762481057-project-member] Image pref 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1049.069209] env[65918]: DEBUG nova.virt.hardware [None req-33b08c2a-a672-4ec7-906f-f008b89ba893 tempest-AttachVolumeShelveTestJSON-1762481057 tempest-AttachVolumeShelveTestJSON-1762481057-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1049.069412] env[65918]: DEBUG nova.virt.hardware [None req-33b08c2a-a672-4ec7-906f-f008b89ba893 tempest-AttachVolumeShelveTestJSON-1762481057 tempest-AttachVolumeShelveTestJSON-1762481057-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1049.069571] env[65918]: DEBUG nova.virt.hardware [None req-33b08c2a-a672-4ec7-906f-f008b89ba893 tempest-AttachVolumeShelveTestJSON-1762481057 tempest-AttachVolumeShelveTestJSON-1762481057-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=65918) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 1049.069730] env[65918]: DEBUG nova.virt.hardware [None req-33b08c2a-a672-4ec7-906f-f008b89ba893 tempest-AttachVolumeShelveTestJSON-1762481057 tempest-AttachVolumeShelveTestJSON-1762481057-project-member] Got 1 possible topologies {{(pid=65918) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1049.069891] env[65918]: DEBUG nova.virt.hardware [None req-33b08c2a-a672-4ec7-906f-f008b89ba893 tempest-AttachVolumeShelveTestJSON-1762481057 tempest-AttachVolumeShelveTestJSON-1762481057-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1049.070075] env[65918]: DEBUG nova.virt.hardware [None req-33b08c2a-a672-4ec7-906f-f008b89ba893 tempest-AttachVolumeShelveTestJSON-1762481057 tempest-AttachVolumeShelveTestJSON-1762481057-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1049.070899] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a7f5e4eb-455a-4c33-8d4a-cd1d1ffe485b {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1049.078211] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0af4ea79-4ec1-40de-a865-9466f01b5ee3 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1049.318271] env[65918]: DEBUG nova.network.neutron [None req-33b08c2a-a672-4ec7-906f-f008b89ba893 tempest-AttachVolumeShelveTestJSON-1762481057 tempest-AttachVolumeShelveTestJSON-1762481057-project-member] [instance: 8017964a-7fe8-40eb-a79d-47e0401a27d1] Successfully created port: e887b448-d7f4-4ed0-90d8-5d8516e6ca78 {{(pid=65918) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1049.826487] env[65918]: DEBUG nova.network.neutron [None req-33b08c2a-a672-4ec7-906f-f008b89ba893 tempest-AttachVolumeShelveTestJSON-1762481057 tempest-AttachVolumeShelveTestJSON-1762481057-project-member] [instance: 8017964a-7fe8-40eb-a79d-47e0401a27d1] Successfully updated port: e887b448-d7f4-4ed0-90d8-5d8516e6ca78 {{(pid=65918) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1049.838933] env[65918]: DEBUG oslo_concurrency.lockutils [None req-33b08c2a-a672-4ec7-906f-f008b89ba893 tempest-AttachVolumeShelveTestJSON-1762481057 tempest-AttachVolumeShelveTestJSON-1762481057-project-member] Acquiring lock "refresh_cache-8017964a-7fe8-40eb-a79d-47e0401a27d1" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1049.839087] env[65918]: DEBUG oslo_concurrency.lockutils [None req-33b08c2a-a672-4ec7-906f-f008b89ba893 tempest-AttachVolumeShelveTestJSON-1762481057 tempest-AttachVolumeShelveTestJSON-1762481057-project-member] Acquired lock "refresh_cache-8017964a-7fe8-40eb-a79d-47e0401a27d1" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1049.839260] env[65918]: DEBUG nova.network.neutron [None req-33b08c2a-a672-4ec7-906f-f008b89ba893 tempest-AttachVolumeShelveTestJSON-1762481057 tempest-AttachVolumeShelveTestJSON-1762481057-project-member] [instance: 8017964a-7fe8-40eb-a79d-47e0401a27d1] Building network info cache for instance {{(pid=65918) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} 
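(Annotation) The nova.virt.hardware records above ("Flavor limits 0:0:0", "Build topologies for 1 vcpu(s) 1:1:1", "Got 1 possible topologies", "Sorted desired topologies ...") show the driver expanding the flavor's vCPU count into candidate sockets/cores/threads layouts under the configured limits and then ordering them by preference. A rough, self-contained illustration of that enumeration step, using only the standard library; it is not Nova's actual _get_possible_cpu_topologies implementation:

    # Illustrative only: enumerate sockets*cores*threads factorizations of a
    # vCPU count under upper limits, mirroring the topology records above.
    from itertools import product

    def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                            max_threads=65536):
        limits = (max_sockets, max_cores, max_threads)
        return [combo for combo in product(range(1, vcpus + 1), repeat=3)
                if combo[0] * combo[1] * combo[2] == vcpus
                and all(v <= m for v, m in zip(combo, limits))]

    # For the m1.nano flavor logged above (vcpus=1) there is a single candidate,
    # matching "Got 1 possible topologies" and the sorted result of (1, 1, 1).
    print(possible_topologies(1))  # [(1, 1, 1)]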
[ 1049.874180] env[65918]: DEBUG nova.network.neutron [None req-33b08c2a-a672-4ec7-906f-f008b89ba893 tempest-AttachVolumeShelveTestJSON-1762481057 tempest-AttachVolumeShelveTestJSON-1762481057-project-member] [instance: 8017964a-7fe8-40eb-a79d-47e0401a27d1] Instance cache missing network info. {{(pid=65918) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1050.039934] env[65918]: DEBUG nova.network.neutron [None req-33b08c2a-a672-4ec7-906f-f008b89ba893 tempest-AttachVolumeShelveTestJSON-1762481057 tempest-AttachVolumeShelveTestJSON-1762481057-project-member] [instance: 8017964a-7fe8-40eb-a79d-47e0401a27d1] Updating instance_info_cache with network_info: [{"id": "e887b448-d7f4-4ed0-90d8-5d8516e6ca78", "address": "fa:16:3e:aa:62:c4", "network": {"id": "af336a25-a885-4591-9aa5-accabdc818a2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-405838915-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "35faa253c2a243e2851f147c1e0ee4d5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "669e4919-e0ad-4e23-9f23-4c5f2be0d858", "external-id": "nsx-vlan-transportzone-362", "segmentation_id": 362, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape887b448-d7", "ovs_interfaceid": "e887b448-d7f4-4ed0-90d8-5d8516e6ca78", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1050.050574] env[65918]: DEBUG oslo_concurrency.lockutils [None req-33b08c2a-a672-4ec7-906f-f008b89ba893 tempest-AttachVolumeShelveTestJSON-1762481057 tempest-AttachVolumeShelveTestJSON-1762481057-project-member] Releasing lock "refresh_cache-8017964a-7fe8-40eb-a79d-47e0401a27d1" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1050.050856] env[65918]: DEBUG nova.compute.manager [None req-33b08c2a-a672-4ec7-906f-f008b89ba893 tempest-AttachVolumeShelveTestJSON-1762481057 tempest-AttachVolumeShelveTestJSON-1762481057-project-member] [instance: 8017964a-7fe8-40eb-a79d-47e0401a27d1] Instance network_info: |[{"id": "e887b448-d7f4-4ed0-90d8-5d8516e6ca78", "address": "fa:16:3e:aa:62:c4", "network": {"id": "af336a25-a885-4591-9aa5-accabdc818a2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-405838915-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "35faa253c2a243e2851f147c1e0ee4d5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "669e4919-e0ad-4e23-9f23-4c5f2be0d858", "external-id": "nsx-vlan-transportzone-362", "segmentation_id": 362, "bound_drivers": {"0": 
"nsxv3"}}, "devname": "tape887b448-d7", "ovs_interfaceid": "e887b448-d7f4-4ed0-90d8-5d8516e6ca78", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=65918) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1050.051315] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-33b08c2a-a672-4ec7-906f-f008b89ba893 tempest-AttachVolumeShelveTestJSON-1762481057 tempest-AttachVolumeShelveTestJSON-1762481057-project-member] [instance: 8017964a-7fe8-40eb-a79d-47e0401a27d1] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:aa:62:c4', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '669e4919-e0ad-4e23-9f23-4c5f2be0d858', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'e887b448-d7f4-4ed0-90d8-5d8516e6ca78', 'vif_model': 'vmxnet3'}] {{(pid=65918) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1050.058983] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [None req-33b08c2a-a672-4ec7-906f-f008b89ba893 tempest-AttachVolumeShelveTestJSON-1762481057 tempest-AttachVolumeShelveTestJSON-1762481057-project-member] Creating folder: Project (35faa253c2a243e2851f147c1e0ee4d5). Parent ref: group-v572679. {{(pid=65918) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1050.059486] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9fe85bc8-af31-49b0-b37d-b593f60a96e7 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1050.070398] env[65918]: INFO nova.virt.vmwareapi.vm_util [None req-33b08c2a-a672-4ec7-906f-f008b89ba893 tempest-AttachVolumeShelveTestJSON-1762481057 tempest-AttachVolumeShelveTestJSON-1762481057-project-member] Created folder: Project (35faa253c2a243e2851f147c1e0ee4d5) in parent group-v572679. [ 1050.070574] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [None req-33b08c2a-a672-4ec7-906f-f008b89ba893 tempest-AttachVolumeShelveTestJSON-1762481057 tempest-AttachVolumeShelveTestJSON-1762481057-project-member] Creating folder: Instances. Parent ref: group-v572736. {{(pid=65918) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1050.070787] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d79c0235-733c-4f00-bd0d-30eebc98f5e8 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1050.079213] env[65918]: INFO nova.virt.vmwareapi.vm_util [None req-33b08c2a-a672-4ec7-906f-f008b89ba893 tempest-AttachVolumeShelveTestJSON-1762481057 tempest-AttachVolumeShelveTestJSON-1762481057-project-member] Created folder: Instances in parent group-v572736. [ 1050.079436] env[65918]: DEBUG oslo.service.loopingcall [None req-33b08c2a-a672-4ec7-906f-f008b89ba893 tempest-AttachVolumeShelveTestJSON-1762481057 tempest-AttachVolumeShelveTestJSON-1762481057-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=65918) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1050.079629] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8017964a-7fe8-40eb-a79d-47e0401a27d1] Creating VM on the ESX host {{(pid=65918) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1050.079812] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b39d076e-53df-4d2a-a4d9-49089dfcbaff {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1050.097666] env[65918]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1050.097666] env[65918]: value = "task-2848224" [ 1050.097666] env[65918]: _type = "Task" [ 1050.097666] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1050.104797] env[65918]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848224, 'name': CreateVM_Task} progress is 0%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1050.607160] env[65918]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848224, 'name': CreateVM_Task, 'duration_secs': 0.319775} completed successfully. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1050.607371] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8017964a-7fe8-40eb-a79d-47e0401a27d1] Created VM on the ESX host {{(pid=65918) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1050.607963] env[65918]: DEBUG oslo_concurrency.lockutils [None req-33b08c2a-a672-4ec7-906f-f008b89ba893 tempest-AttachVolumeShelveTestJSON-1762481057 tempest-AttachVolumeShelveTestJSON-1762481057-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1050.608141] env[65918]: DEBUG oslo_concurrency.lockutils [None req-33b08c2a-a672-4ec7-906f-f008b89ba893 tempest-AttachVolumeShelveTestJSON-1762481057 tempest-AttachVolumeShelveTestJSON-1762481057-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1050.608467] env[65918]: DEBUG oslo_concurrency.lockutils [None req-33b08c2a-a672-4ec7-906f-f008b89ba893 tempest-AttachVolumeShelveTestJSON-1762481057 tempest-AttachVolumeShelveTestJSON-1762481057-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1050.608745] env[65918]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e881a115-d52e-4fcc-8e4e-03b198d6fbef {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1050.613204] env[65918]: DEBUG oslo_vmware.api [None req-33b08c2a-a672-4ec7-906f-f008b89ba893 tempest-AttachVolumeShelveTestJSON-1762481057 tempest-AttachVolumeShelveTestJSON-1762481057-project-member] Waiting for the task: (returnval){ [ 1050.613204] env[65918]: value = "session[523f5ec2-7667-1e2c-bfa3-3051980fd847]52ad458a-cb79-9b53-a89b-ee7d8b8516bb" [ 1050.613204] env[65918]: _type = "Task" [ 1050.613204] env[65918]: } to complete. 
{{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1050.621675] env[65918]: DEBUG oslo_vmware.api [None req-33b08c2a-a672-4ec7-906f-f008b89ba893 tempest-AttachVolumeShelveTestJSON-1762481057 tempest-AttachVolumeShelveTestJSON-1762481057-project-member] Task: {'id': session[523f5ec2-7667-1e2c-bfa3-3051980fd847]52ad458a-cb79-9b53-a89b-ee7d8b8516bb, 'name': SearchDatastore_Task} progress is 0%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1050.760347] env[65918]: DEBUG nova.compute.manager [req-70f8ef15-3c2d-4cbf-97b8-474276726271 req-dc3b97a6-d610-4a0e-9dd8-35849c80a4ff service nova] [instance: 8017964a-7fe8-40eb-a79d-47e0401a27d1] Received event network-vif-plugged-e887b448-d7f4-4ed0-90d8-5d8516e6ca78 {{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1050.760511] env[65918]: DEBUG oslo_concurrency.lockutils [req-70f8ef15-3c2d-4cbf-97b8-474276726271 req-dc3b97a6-d610-4a0e-9dd8-35849c80a4ff service nova] Acquiring lock "8017964a-7fe8-40eb-a79d-47e0401a27d1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1050.760718] env[65918]: DEBUG oslo_concurrency.lockutils [req-70f8ef15-3c2d-4cbf-97b8-474276726271 req-dc3b97a6-d610-4a0e-9dd8-35849c80a4ff service nova] Lock "8017964a-7fe8-40eb-a79d-47e0401a27d1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1050.760893] env[65918]: DEBUG oslo_concurrency.lockutils [req-70f8ef15-3c2d-4cbf-97b8-474276726271 req-dc3b97a6-d610-4a0e-9dd8-35849c80a4ff service nova] Lock "8017964a-7fe8-40eb-a79d-47e0401a27d1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1050.761200] env[65918]: DEBUG nova.compute.manager [req-70f8ef15-3c2d-4cbf-97b8-474276726271 req-dc3b97a6-d610-4a0e-9dd8-35849c80a4ff service nova] [instance: 8017964a-7fe8-40eb-a79d-47e0401a27d1] No waiting events found dispatching network-vif-plugged-e887b448-d7f4-4ed0-90d8-5d8516e6ca78 {{(pid=65918) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1050.761424] env[65918]: WARNING nova.compute.manager [req-70f8ef15-3c2d-4cbf-97b8-474276726271 req-dc3b97a6-d610-4a0e-9dd8-35849c80a4ff service nova] [instance: 8017964a-7fe8-40eb-a79d-47e0401a27d1] Received unexpected event network-vif-plugged-e887b448-d7f4-4ed0-90d8-5d8516e6ca78 for instance with vm_state building and task_state spawning. [ 1050.761594] env[65918]: DEBUG nova.compute.manager [req-70f8ef15-3c2d-4cbf-97b8-474276726271 req-dc3b97a6-d610-4a0e-9dd8-35849c80a4ff service nova] [instance: 8017964a-7fe8-40eb-a79d-47e0401a27d1] Received event network-changed-e887b448-d7f4-4ed0-90d8-5d8516e6ca78 {{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1050.761750] env[65918]: DEBUG nova.compute.manager [req-70f8ef15-3c2d-4cbf-97b8-474276726271 req-dc3b97a6-d610-4a0e-9dd8-35849c80a4ff service nova] [instance: 8017964a-7fe8-40eb-a79d-47e0401a27d1] Refreshing instance network info cache due to event network-changed-e887b448-d7f4-4ed0-90d8-5d8516e6ca78. 
{{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 1050.761931] env[65918]: DEBUG oslo_concurrency.lockutils [req-70f8ef15-3c2d-4cbf-97b8-474276726271 req-dc3b97a6-d610-4a0e-9dd8-35849c80a4ff service nova] Acquiring lock "refresh_cache-8017964a-7fe8-40eb-a79d-47e0401a27d1" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1050.762077] env[65918]: DEBUG oslo_concurrency.lockutils [req-70f8ef15-3c2d-4cbf-97b8-474276726271 req-dc3b97a6-d610-4a0e-9dd8-35849c80a4ff service nova] Acquired lock "refresh_cache-8017964a-7fe8-40eb-a79d-47e0401a27d1" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1050.762233] env[65918]: DEBUG nova.network.neutron [req-70f8ef15-3c2d-4cbf-97b8-474276726271 req-dc3b97a6-d610-4a0e-9dd8-35849c80a4ff service nova] [instance: 8017964a-7fe8-40eb-a79d-47e0401a27d1] Refreshing network info cache for port e887b448-d7f4-4ed0-90d8-5d8516e6ca78 {{(pid=65918) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1050.990796] env[65918]: DEBUG nova.network.neutron [req-70f8ef15-3c2d-4cbf-97b8-474276726271 req-dc3b97a6-d610-4a0e-9dd8-35849c80a4ff service nova] [instance: 8017964a-7fe8-40eb-a79d-47e0401a27d1] Updated VIF entry in instance network info cache for port e887b448-d7f4-4ed0-90d8-5d8516e6ca78. {{(pid=65918) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1050.991164] env[65918]: DEBUG nova.network.neutron [req-70f8ef15-3c2d-4cbf-97b8-474276726271 req-dc3b97a6-d610-4a0e-9dd8-35849c80a4ff service nova] [instance: 8017964a-7fe8-40eb-a79d-47e0401a27d1] Updating instance_info_cache with network_info: [{"id": "e887b448-d7f4-4ed0-90d8-5d8516e6ca78", "address": "fa:16:3e:aa:62:c4", "network": {"id": "af336a25-a885-4591-9aa5-accabdc818a2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-405838915-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "35faa253c2a243e2851f147c1e0ee4d5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "669e4919-e0ad-4e23-9f23-4c5f2be0d858", "external-id": "nsx-vlan-transportzone-362", "segmentation_id": 362, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape887b448-d7", "ovs_interfaceid": "e887b448-d7f4-4ed0-90d8-5d8516e6ca78", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1050.999904] env[65918]: DEBUG oslo_concurrency.lockutils [req-70f8ef15-3c2d-4cbf-97b8-474276726271 req-dc3b97a6-d610-4a0e-9dd8-35849c80a4ff service nova] Releasing lock "refresh_cache-8017964a-7fe8-40eb-a79d-47e0401a27d1" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1051.124149] env[65918]: DEBUG oslo_concurrency.lockutils [None req-33b08c2a-a672-4ec7-906f-f008b89ba893 tempest-AttachVolumeShelveTestJSON-1762481057 tempest-AttachVolumeShelveTestJSON-1762481057-project-member] Releasing lock 
"[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1051.124747] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-33b08c2a-a672-4ec7-906f-f008b89ba893 tempest-AttachVolumeShelveTestJSON-1762481057 tempest-AttachVolumeShelveTestJSON-1762481057-project-member] [instance: 8017964a-7fe8-40eb-a79d-47e0401a27d1] Processing image e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1051.124747] env[65918]: DEBUG oslo_concurrency.lockutils [None req-33b08c2a-a672-4ec7-906f-f008b89ba893 tempest-AttachVolumeShelveTestJSON-1762481057 tempest-AttachVolumeShelveTestJSON-1762481057-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1054.424840] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1055.419350] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1055.422925] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1055.423096] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Starting heal instance info cache {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 1055.423217] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Rebuilding the list of instances to heal {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 1055.436422] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1055.436716] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1055.436751] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: 3faaaacf-815e-4493-81a7-2a32f868442a] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1055.436863] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: 8017964a-7fe8-40eb-a79d-47e0401a27d1] Skipping network cache update for instance because it is Building. 
{{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1055.436988] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Didn't find any instances for network info cache update. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 1056.423820] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1056.424019] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=65918) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 1057.424158] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager.update_available_resource {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1057.433766] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1057.433988] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1057.434182] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1057.434341] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=65918) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1057.435398] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01507aa9-243b-4cda-9841-1a469bfc58d4 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1057.444405] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6b6739d7-18f8-497c-9ddf-3ea9321d322a {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1057.458167] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba95a08e-8b52-416b-b694-f6b9de45e4e0 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1057.464261] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b934ec78-80a4-45ca-9b2f-f763b981a9a6 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 
1057.493055] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181077MB free_disk=168GB free_vcpus=48 pci_devices=None {{(pid=65918) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1057.493055] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1057.493195] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1057.538977] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 51163f89-c8b6-48a8-bbbe-de63c44d92a5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1057.539154] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance bba6f3d9-1be3-4048-86d5-f435511b0fc0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1057.539288] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 3faaaacf-815e-4493-81a7-2a32f868442a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1057.539416] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 8017964a-7fe8-40eb-a79d-47e0401a27d1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1057.549246] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance a0f06a58-65d2-4325-8f93-0948b4e5ac8c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 1057.558763] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 81fef129-8f9a-4a19-afc0-f27411c36159 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 1057.558959] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Total usable vcpus: 48, total allocated vcpus: 4 {{(pid=65918) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1057.559116] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1024MB phys_disk=200GB used_disk=4GB total_vcpus=48 used_vcpus=4 pci_stats=[] {{(pid=65918) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1057.635669] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb123e92-de92-4420-9af0-40135d6892cc {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1057.642864] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-26adbf5a-f352-4f87-9e98-ea9776fb8ade {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1057.672390] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aecc11a6-e8eb-435e-9652-0ff85d0d8668 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1057.679088] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e81d8d5-6004-4234-9271-354339ef8b5c {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1057.691971] env[65918]: DEBUG nova.compute.provider_tree [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Inventory has not changed in ProviderTree for provider: 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 {{(pid=65918) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1057.701229] env[65918]: DEBUG nova.scheduler.client.report [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Inventory has not changed for provider 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65918) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1057.714348] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=65918) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1057.714556] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.221s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1058.714519] env[65918]: DEBUG 
oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1058.714814] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1058.714907] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1060.419494] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1060.434195] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1096.909985] env[65918]: WARNING oslo_vmware.rw_handles [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1096.909985] env[65918]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1096.909985] env[65918]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1096.909985] env[65918]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1096.909985] env[65918]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1096.909985] env[65918]: ERROR oslo_vmware.rw_handles response.begin() [ 1096.909985] env[65918]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1096.909985] env[65918]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1096.909985] env[65918]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1096.909985] env[65918]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1096.909985] env[65918]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1096.909985] env[65918]: ERROR oslo_vmware.rw_handles [ 1096.910747] env[65918]: DEBUG nova.virt.vmwareapi.images [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Downloaded image file data e017c336-3a02-4b58-874a-44a1d1e154fd to vmware_temp/cee9d159-3f75-4c70-b48d-d1b223c84809/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk on the data store datastore1 {{(pid=65918) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1096.912082] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-927f9b10-6597-410c-a196-6e10f1e1379a 
tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Caching image {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1096.912336] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Copying Virtual Disk [datastore1] vmware_temp/cee9d159-3f75-4c70-b48d-d1b223c84809/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk to [datastore1] vmware_temp/cee9d159-3f75-4c70-b48d-d1b223c84809/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk {{(pid=65918) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1096.912619] env[65918]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-d684d29c-a10e-4568-a11e-8c222e78b0de {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1096.921948] env[65918]: DEBUG oslo_vmware.api [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Waiting for the task: (returnval){ [ 1096.921948] env[65918]: value = "task-2848225" [ 1096.921948] env[65918]: _type = "Task" [ 1096.921948] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1096.929628] env[65918]: DEBUG oslo_vmware.api [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Task: {'id': task-2848225, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1097.432366] env[65918]: DEBUG oslo_vmware.exceptions [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Fault InvalidArgument not matched. 
{{(pid=65918) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1097.432607] env[65918]: DEBUG oslo_concurrency.lockutils [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1097.433154] env[65918]: ERROR nova.compute.manager [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1097.433154] env[65918]: Faults: ['InvalidArgument'] [ 1097.433154] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Traceback (most recent call last): [ 1097.433154] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1097.433154] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] yield resources [ 1097.433154] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1097.433154] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] self.driver.spawn(context, instance, image_meta, [ 1097.433154] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1097.433154] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1097.433154] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1097.433154] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] self._fetch_image_if_missing(context, vi) [ 1097.433154] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1097.433501] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] image_cache(vi, tmp_image_ds_loc) [ 1097.433501] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1097.433501] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] vm_util.copy_virtual_disk( [ 1097.433501] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1097.433501] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] session._wait_for_task(vmdk_copy_task) [ 1097.433501] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in 
_wait_for_task [ 1097.433501] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] return self.wait_for_task(task_ref) [ 1097.433501] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1097.433501] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] return evt.wait() [ 1097.433501] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1097.433501] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] result = hub.switch() [ 1097.433501] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1097.433501] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] return self.greenlet.switch() [ 1097.433881] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1097.433881] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] self.f(*self.args, **self.kw) [ 1097.433881] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1097.433881] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] raise exceptions.translate_fault(task_info.error) [ 1097.433881] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1097.433881] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Faults: ['InvalidArgument'] [ 1097.433881] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] [ 1097.433881] env[65918]: INFO nova.compute.manager [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Terminating instance [ 1097.434973] env[65918]: DEBUG oslo_concurrency.lockutils [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1097.435191] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1097.435415] env[65918]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-003df2b7-9a6e-4568-a689-3d092ad7803e {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1097.438681] env[65918]: 
DEBUG nova.compute.manager [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Start destroying the instance on the hypervisor. {{(pid=65918) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1097.438869] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Destroying instance {{(pid=65918) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1097.439584] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-47118b59-29ea-42cf-965d-c9fc5ffa3230 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1097.445996] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Unregistering the VM {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1097.446206] env[65918]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-ab3187b4-60fe-4627-b361-f9fb2bcb0ff1 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1097.448259] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1097.448426] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=65918) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1097.449314] env[65918]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ee9e5740-67b6-4098-88f7-53c45e5064ec {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1097.453730] env[65918]: DEBUG oslo_vmware.api [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Waiting for the task: (returnval){ [ 1097.453730] env[65918]: value = "session[523f5ec2-7667-1e2c-bfa3-3051980fd847]52a7a150-ecf5-6708-4de4-d4e9fcf518ad" [ 1097.453730] env[65918]: _type = "Task" [ 1097.453730] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1097.460888] env[65918]: DEBUG oslo_vmware.api [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Task: {'id': session[523f5ec2-7667-1e2c-bfa3-3051980fd847]52a7a150-ecf5-6708-4de4-d4e9fcf518ad, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1097.512970] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Unregistered the VM {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1097.513217] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Deleting contents of the VM from datastore datastore1 {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1097.513550] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Deleting the datastore file [datastore1] 51163f89-c8b6-48a8-bbbe-de63c44d92a5 {{(pid=65918) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1097.513892] env[65918]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-738e8093-998f-4f81-88c6-a0955ca5c60f {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1097.520277] env[65918]: DEBUG oslo_vmware.api [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Waiting for the task: (returnval){ [ 1097.520277] env[65918]: value = "task-2848227" [ 1097.520277] env[65918]: _type = "Task" [ 1097.520277] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1097.527420] env[65918]: DEBUG oslo_vmware.api [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Task: {'id': task-2848227, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1097.964153] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Preparing fetch location {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1097.964542] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Creating directory with path [datastore1] vmware_temp/c629e49c-430c-4636-95e5-b225a056ad94/e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1097.964654] env[65918]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-fc2c2e66-f6de-418e-af30-d21e247be165 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1097.975629] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Created directory with path [datastore1] vmware_temp/c629e49c-430c-4636-95e5-b225a056ad94/e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1097.975813] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Fetch image to [datastore1] vmware_temp/c629e49c-430c-4636-95e5-b225a056ad94/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1097.975979] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Downloading image file data e017c336-3a02-4b58-874a-44a1d1e154fd to [datastore1] vmware_temp/c629e49c-430c-4636-95e5-b225a056ad94/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk on the data store datastore1 {{(pid=65918) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1097.976702] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ee17dac9-61a5-49ab-b201-6510bb4882e5 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1097.982970] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2bbb309e-dc66-4385-8601-7a6fdd82a12d {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1097.991670] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d7434bd-c948-4dfe-a83e-1c8efa20fc9e {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1098.026220] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e450691d-71d6-48c1-b9fa-f2a4be2e66bf {{(pid=65918) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1098.034269] env[65918]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-c0936a40-120a-4d17-9d03-a8fcce8dfcb1 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1098.035890] env[65918]: DEBUG oslo_vmware.api [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Task: {'id': task-2848227, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.080276} completed successfully. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1098.036132] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Deleted the datastore file {{(pid=65918) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1098.036315] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Deleted contents of the VM from datastore datastore1 {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1098.036486] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Instance destroyed {{(pid=65918) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1098.036653] env[65918]: INFO nova.compute.manager [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1098.038710] env[65918]: DEBUG nova.compute.claims [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Aborting claim: {{(pid=65918) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1098.038883] env[65918]: DEBUG oslo_concurrency.lockutils [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1098.039109] env[65918]: DEBUG oslo_concurrency.lockutils [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1098.060488] env[65918]: DEBUG nova.virt.vmwareapi.images [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Downloading image file data e017c336-3a02-4b58-874a-44a1d1e154fd to the data store datastore1 {{(pid=65918) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1098.109211] env[65918]: DEBUG oslo_vmware.rw_handles [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c629e49c-430c-4636-95e5-b225a056ad94/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=65918) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1098.167609] env[65918]: DEBUG oslo_vmware.rw_handles [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Completed reading data from the image iterator. {{(pid=65918) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1098.167795] env[65918]: DEBUG oslo_vmware.rw_handles [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c629e49c-430c-4636-95e5-b225a056ad94/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=65918) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1098.207176] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-43fd10eb-6568-43c6-b94f-e57e93b061e6 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1098.214590] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-53e19011-57db-4920-af65-cb568704c118 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1098.243796] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c305010-fb69-43ba-a7e9-3a48b20b7026 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1098.250810] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a81c39c-a149-4392-b7cb-fa54cb84dd5b {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1098.263866] env[65918]: DEBUG nova.compute.provider_tree [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Inventory has not changed in ProviderTree for provider: 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 {{(pid=65918) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1098.272287] env[65918]: DEBUG nova.scheduler.client.report [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Inventory has not changed for provider 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65918) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1098.284883] env[65918]: DEBUG oslo_concurrency.lockutils [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.246s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1098.285403] env[65918]: ERROR nova.compute.manager [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1098.285403] env[65918]: Faults: ['InvalidArgument'] [ 1098.285403] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Traceback (most recent call last): [ 1098.285403] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1098.285403] env[65918]: ERROR nova.compute.manager [instance: 
51163f89-c8b6-48a8-bbbe-de63c44d92a5] self.driver.spawn(context, instance, image_meta, [ 1098.285403] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1098.285403] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1098.285403] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1098.285403] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] self._fetch_image_if_missing(context, vi) [ 1098.285403] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1098.285403] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] image_cache(vi, tmp_image_ds_loc) [ 1098.285403] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1098.285770] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] vm_util.copy_virtual_disk( [ 1098.285770] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1098.285770] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] session._wait_for_task(vmdk_copy_task) [ 1098.285770] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1098.285770] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] return self.wait_for_task(task_ref) [ 1098.285770] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1098.285770] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] return evt.wait() [ 1098.285770] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1098.285770] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] result = hub.switch() [ 1098.285770] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1098.285770] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] return self.greenlet.switch() [ 1098.285770] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1098.285770] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] self.f(*self.args, **self.kw) [ 1098.286164] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 
1098.286164] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] raise exceptions.translate_fault(task_info.error) [ 1098.286164] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1098.286164] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Faults: ['InvalidArgument'] [ 1098.286164] env[65918]: ERROR nova.compute.manager [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] [ 1098.286164] env[65918]: DEBUG nova.compute.utils [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] VimFaultException {{(pid=65918) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1098.287647] env[65918]: DEBUG nova.compute.manager [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Build of instance 51163f89-c8b6-48a8-bbbe-de63c44d92a5 was re-scheduled: A specified parameter was not correct: fileType [ 1098.287647] env[65918]: Faults: ['InvalidArgument'] {{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 1098.288033] env[65918]: DEBUG nova.compute.manager [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Unplugging VIFs for instance {{(pid=65918) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1098.288211] env[65918]: DEBUG nova.compute.manager [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=65918) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1098.288393] env[65918]: DEBUG nova.compute.manager [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Deallocating network for instance {{(pid=65918) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1098.288561] env[65918]: DEBUG nova.network.neutron [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] deallocate_for_instance() {{(pid=65918) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1098.594444] env[65918]: DEBUG nova.network.neutron [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Updating instance_info_cache with network_info: [] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1098.605965] env[65918]: INFO nova.compute.manager [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Took 0.32 seconds to deallocate network for instance. [ 1098.691025] env[65918]: INFO nova.scheduler.client.report [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Deleted allocations for instance 51163f89-c8b6-48a8-bbbe-de63c44d92a5 [ 1098.706999] env[65918]: DEBUG oslo_concurrency.lockutils [None req-927f9b10-6597-410c-a196-6e10f1e1379a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Lock "51163f89-c8b6-48a8-bbbe-de63c44d92a5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 468.056s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1098.708081] env[65918]: DEBUG oslo_concurrency.lockutils [None req-5c76c5ee-e848-4309-8d04-bdef433be87a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Lock "51163f89-c8b6-48a8-bbbe-de63c44d92a5" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 268.328s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1098.708765] env[65918]: DEBUG oslo_concurrency.lockutils [None req-5c76c5ee-e848-4309-8d04-bdef433be87a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Acquiring lock "51163f89-c8b6-48a8-bbbe-de63c44d92a5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1098.708765] env[65918]: DEBUG oslo_concurrency.lockutils [None req-5c76c5ee-e848-4309-8d04-bdef433be87a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Lock "51163f89-c8b6-48a8-bbbe-de63c44d92a5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=65918) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1098.708765] env[65918]: DEBUG oslo_concurrency.lockutils [None req-5c76c5ee-e848-4309-8d04-bdef433be87a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Lock "51163f89-c8b6-48a8-bbbe-de63c44d92a5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1098.710817] env[65918]: INFO nova.compute.manager [None req-5c76c5ee-e848-4309-8d04-bdef433be87a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Terminating instance [ 1098.712188] env[65918]: DEBUG nova.compute.manager [None req-5c76c5ee-e848-4309-8d04-bdef433be87a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Start destroying the instance on the hypervisor. {{(pid=65918) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1098.713301] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-5c76c5ee-e848-4309-8d04-bdef433be87a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Destroying instance {{(pid=65918) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1098.713301] env[65918]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-a601a3ef-5eaf-4a6c-9014-0c85200239ae {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1098.721436] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00836ff2-f82d-40bd-9a00-aad9bf673d12 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1098.731991] env[65918]: DEBUG nova.compute.manager [None req-5bfef4da-d1f4-47fc-a19c-4d4d492d3ff5 tempest-ServerActionsTestOtherA-1867635188 tempest-ServerActionsTestOtherA-1867635188-project-member] [instance: a0f06a58-65d2-4325-8f93-0948b4e5ac8c] Starting instance... {{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1098.751972] env[65918]: WARNING nova.virt.vmwareapi.vmops [None req-5c76c5ee-e848-4309-8d04-bdef433be87a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 51163f89-c8b6-48a8-bbbe-de63c44d92a5 could not be found. [ 1098.752189] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-5c76c5ee-e848-4309-8d04-bdef433be87a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Instance destroyed {{(pid=65918) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1098.752366] env[65918]: INFO nova.compute.manager [None req-5c76c5ee-e848-4309-8d04-bdef433be87a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 1098.752597] env[65918]: DEBUG oslo.service.loopingcall [None req-5c76c5ee-e848-4309-8d04-bdef433be87a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=65918) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1098.752832] env[65918]: DEBUG nova.compute.manager [-] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Deallocating network for instance {{(pid=65918) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1098.752938] env[65918]: DEBUG nova.network.neutron [-] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] deallocate_for_instance() {{(pid=65918) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1098.777792] env[65918]: DEBUG nova.network.neutron [-] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Updating instance_info_cache with network_info: [] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1098.785354] env[65918]: INFO nova.compute.manager [-] [instance: 51163f89-c8b6-48a8-bbbe-de63c44d92a5] Took 0.03 seconds to deallocate network for instance. [ 1098.787425] env[65918]: DEBUG oslo_concurrency.lockutils [None req-5bfef4da-d1f4-47fc-a19c-4d4d492d3ff5 tempest-ServerActionsTestOtherA-1867635188 tempest-ServerActionsTestOtherA-1867635188-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1098.787655] env[65918]: DEBUG oslo_concurrency.lockutils [None req-5bfef4da-d1f4-47fc-a19c-4d4d492d3ff5 tempest-ServerActionsTestOtherA-1867635188 tempest-ServerActionsTestOtherA-1867635188-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1098.789197] env[65918]: INFO nova.compute.claims [None req-5bfef4da-d1f4-47fc-a19c-4d4d492d3ff5 tempest-ServerActionsTestOtherA-1867635188 tempest-ServerActionsTestOtherA-1867635188-project-member] [instance: a0f06a58-65d2-4325-8f93-0948b4e5ac8c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1098.869822] env[65918]: DEBUG oslo_concurrency.lockutils [None req-5c76c5ee-e848-4309-8d04-bdef433be87a tempest-ServerDiagnosticsTest-1377592396 tempest-ServerDiagnosticsTest-1377592396-project-member] Lock "51163f89-c8b6-48a8-bbbe-de63c44d92a5" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.162s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1098.898638] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2040cdc3-cca6-4e86-ae53-41f17b2d08c5 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1098.905644] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb3a9033-a94d-47aa-84bf-3689f3e18656 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1098.936074] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-6a700fc9-db04-492a-9e82-affaeb0641aa {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1098.944252] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7cd438f-4001-42bb-8b18-4697ddc63077 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1098.958171] env[65918]: DEBUG nova.compute.provider_tree [None req-5bfef4da-d1f4-47fc-a19c-4d4d492d3ff5 tempest-ServerActionsTestOtherA-1867635188 tempest-ServerActionsTestOtherA-1867635188-project-member] Inventory has not changed in ProviderTree for provider: 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 {{(pid=65918) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1098.966659] env[65918]: DEBUG nova.scheduler.client.report [None req-5bfef4da-d1f4-47fc-a19c-4d4d492d3ff5 tempest-ServerActionsTestOtherA-1867635188 tempest-ServerActionsTestOtherA-1867635188-project-member] Inventory has not changed for provider 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65918) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1098.978820] env[65918]: DEBUG oslo_concurrency.lockutils [None req-5bfef4da-d1f4-47fc-a19c-4d4d492d3ff5 tempest-ServerActionsTestOtherA-1867635188 tempest-ServerActionsTestOtherA-1867635188-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.191s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1098.979255] env[65918]: DEBUG nova.compute.manager [None req-5bfef4da-d1f4-47fc-a19c-4d4d492d3ff5 tempest-ServerActionsTestOtherA-1867635188 tempest-ServerActionsTestOtherA-1867635188-project-member] [instance: a0f06a58-65d2-4325-8f93-0948b4e5ac8c] Start building networks asynchronously for instance. {{(pid=65918) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1099.008336] env[65918]: DEBUG nova.compute.utils [None req-5bfef4da-d1f4-47fc-a19c-4d4d492d3ff5 tempest-ServerActionsTestOtherA-1867635188 tempest-ServerActionsTestOtherA-1867635188-project-member] Using /dev/sd instead of None {{(pid=65918) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1099.009580] env[65918]: DEBUG nova.compute.manager [None req-5bfef4da-d1f4-47fc-a19c-4d4d492d3ff5 tempest-ServerActionsTestOtherA-1867635188 tempest-ServerActionsTestOtherA-1867635188-project-member] [instance: a0f06a58-65d2-4325-8f93-0948b4e5ac8c] Allocating IP information in the background. 
{{(pid=65918) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1099.009746] env[65918]: DEBUG nova.network.neutron [None req-5bfef4da-d1f4-47fc-a19c-4d4d492d3ff5 tempest-ServerActionsTestOtherA-1867635188 tempest-ServerActionsTestOtherA-1867635188-project-member] [instance: a0f06a58-65d2-4325-8f93-0948b4e5ac8c] allocate_for_instance() {{(pid=65918) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1099.017890] env[65918]: DEBUG nova.compute.manager [None req-5bfef4da-d1f4-47fc-a19c-4d4d492d3ff5 tempest-ServerActionsTestOtherA-1867635188 tempest-ServerActionsTestOtherA-1867635188-project-member] [instance: a0f06a58-65d2-4325-8f93-0948b4e5ac8c] Start building block device mappings for instance. {{(pid=65918) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1099.063867] env[65918]: DEBUG nova.policy [None req-5bfef4da-d1f4-47fc-a19c-4d4d492d3ff5 tempest-ServerActionsTestOtherA-1867635188 tempest-ServerActionsTestOtherA-1867635188-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6ef68af888964f33930792acd2eccfe7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '91e03514a943477fb514c386edf9ec15', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=65918) authorize /opt/stack/nova/nova/policy.py:203}} [ 1099.080988] env[65918]: DEBUG nova.compute.manager [None req-5bfef4da-d1f4-47fc-a19c-4d4d492d3ff5 tempest-ServerActionsTestOtherA-1867635188 tempest-ServerActionsTestOtherA-1867635188-project-member] [instance: a0f06a58-65d2-4325-8f93-0948b4e5ac8c] Start spawning the instance on the hypervisor. 
{{(pid=65918) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1099.103444] env[65918]: DEBUG nova.virt.hardware [None req-5bfef4da-d1f4-47fc-a19c-4d4d492d3ff5 tempest-ServerActionsTestOtherA-1867635188 tempest-ServerActionsTestOtherA-1867635188-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-03-07T07:52:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:52:33Z,direct_url=,disk_format='vmdk',id=e017c336-3a02-4b58-874a-44a1d1e154fd,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='a06639ef48fd43768e273db76d6c8f54',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:52:33Z,virtual_size=,visibility=), allow threads: False {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1099.103686] env[65918]: DEBUG nova.virt.hardware [None req-5bfef4da-d1f4-47fc-a19c-4d4d492d3ff5 tempest-ServerActionsTestOtherA-1867635188 tempest-ServerActionsTestOtherA-1867635188-project-member] Flavor limits 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1099.103841] env[65918]: DEBUG nova.virt.hardware [None req-5bfef4da-d1f4-47fc-a19c-4d4d492d3ff5 tempest-ServerActionsTestOtherA-1867635188 tempest-ServerActionsTestOtherA-1867635188-project-member] Image limits 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1099.104029] env[65918]: DEBUG nova.virt.hardware [None req-5bfef4da-d1f4-47fc-a19c-4d4d492d3ff5 tempest-ServerActionsTestOtherA-1867635188 tempest-ServerActionsTestOtherA-1867635188-project-member] Flavor pref 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1099.104579] env[65918]: DEBUG nova.virt.hardware [None req-5bfef4da-d1f4-47fc-a19c-4d4d492d3ff5 tempest-ServerActionsTestOtherA-1867635188 tempest-ServerActionsTestOtherA-1867635188-project-member] Image pref 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1099.104579] env[65918]: DEBUG nova.virt.hardware [None req-5bfef4da-d1f4-47fc-a19c-4d4d492d3ff5 tempest-ServerActionsTestOtherA-1867635188 tempest-ServerActionsTestOtherA-1867635188-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1099.104579] env[65918]: DEBUG nova.virt.hardware [None req-5bfef4da-d1f4-47fc-a19c-4d4d492d3ff5 tempest-ServerActionsTestOtherA-1867635188 tempest-ServerActionsTestOtherA-1867635188-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1099.104792] env[65918]: DEBUG nova.virt.hardware [None req-5bfef4da-d1f4-47fc-a19c-4d4d492d3ff5 tempest-ServerActionsTestOtherA-1867635188 tempest-ServerActionsTestOtherA-1867635188-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=65918) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1099.104792] env[65918]: DEBUG 
nova.virt.hardware [None req-5bfef4da-d1f4-47fc-a19c-4d4d492d3ff5 tempest-ServerActionsTestOtherA-1867635188 tempest-ServerActionsTestOtherA-1867635188-project-member] Got 1 possible topologies {{(pid=65918) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1099.104954] env[65918]: DEBUG nova.virt.hardware [None req-5bfef4da-d1f4-47fc-a19c-4d4d492d3ff5 tempest-ServerActionsTestOtherA-1867635188 tempest-ServerActionsTestOtherA-1867635188-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1099.105139] env[65918]: DEBUG nova.virt.hardware [None req-5bfef4da-d1f4-47fc-a19c-4d4d492d3ff5 tempest-ServerActionsTestOtherA-1867635188 tempest-ServerActionsTestOtherA-1867635188-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1099.105979] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e9cf371-b5a3-4c7a-8d3f-6b8f20e6c5e8 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1099.113655] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a0e623c-76b6-4de4-9f17-3fab2538a19d {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1099.418637] env[65918]: DEBUG nova.network.neutron [None req-5bfef4da-d1f4-47fc-a19c-4d4d492d3ff5 tempest-ServerActionsTestOtherA-1867635188 tempest-ServerActionsTestOtherA-1867635188-project-member] [instance: a0f06a58-65d2-4325-8f93-0948b4e5ac8c] Successfully created port: 3308606a-7d78-4c8a-9933-53d19b489b59 {{(pid=65918) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1099.931609] env[65918]: DEBUG nova.network.neutron [None req-5bfef4da-d1f4-47fc-a19c-4d4d492d3ff5 tempest-ServerActionsTestOtherA-1867635188 tempest-ServerActionsTestOtherA-1867635188-project-member] [instance: a0f06a58-65d2-4325-8f93-0948b4e5ac8c] Successfully updated port: 3308606a-7d78-4c8a-9933-53d19b489b59 {{(pid=65918) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1099.942134] env[65918]: DEBUG oslo_concurrency.lockutils [None req-5bfef4da-d1f4-47fc-a19c-4d4d492d3ff5 tempest-ServerActionsTestOtherA-1867635188 tempest-ServerActionsTestOtherA-1867635188-project-member] Acquiring lock "refresh_cache-a0f06a58-65d2-4325-8f93-0948b4e5ac8c" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1099.942134] env[65918]: DEBUG oslo_concurrency.lockutils [None req-5bfef4da-d1f4-47fc-a19c-4d4d492d3ff5 tempest-ServerActionsTestOtherA-1867635188 tempest-ServerActionsTestOtherA-1867635188-project-member] Acquired lock "refresh_cache-a0f06a58-65d2-4325-8f93-0948b4e5ac8c" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1099.942134] env[65918]: DEBUG nova.network.neutron [None req-5bfef4da-d1f4-47fc-a19c-4d4d492d3ff5 tempest-ServerActionsTestOtherA-1867635188 tempest-ServerActionsTestOtherA-1867635188-project-member] [instance: a0f06a58-65d2-4325-8f93-0948b4e5ac8c] Building network info cache for instance {{(pid=65918) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1099.976863] env[65918]: DEBUG nova.network.neutron [None req-5bfef4da-d1f4-47fc-a19c-4d4d492d3ff5 
tempest-ServerActionsTestOtherA-1867635188 tempest-ServerActionsTestOtherA-1867635188-project-member] [instance: a0f06a58-65d2-4325-8f93-0948b4e5ac8c] Instance cache missing network info. {{(pid=65918) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1100.123562] env[65918]: DEBUG nova.network.neutron [None req-5bfef4da-d1f4-47fc-a19c-4d4d492d3ff5 tempest-ServerActionsTestOtherA-1867635188 tempest-ServerActionsTestOtherA-1867635188-project-member] [instance: a0f06a58-65d2-4325-8f93-0948b4e5ac8c] Updating instance_info_cache with network_info: [{"id": "3308606a-7d78-4c8a-9933-53d19b489b59", "address": "fa:16:3e:7d:d0:2d", "network": {"id": "409d23af-ccaa-4f8f-8af9-0be9459eb1f0", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-457439374-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "91e03514a943477fb514c386edf9ec15", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "61b8f0db-488e-42d7-bf6c-6c1665cd5616", "external-id": "nsx-vlan-transportzone-655", "segmentation_id": 655, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3308606a-7d", "ovs_interfaceid": "3308606a-7d78-4c8a-9933-53d19b489b59", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1100.137575] env[65918]: DEBUG oslo_concurrency.lockutils [None req-5bfef4da-d1f4-47fc-a19c-4d4d492d3ff5 tempest-ServerActionsTestOtherA-1867635188 tempest-ServerActionsTestOtherA-1867635188-project-member] Releasing lock "refresh_cache-a0f06a58-65d2-4325-8f93-0948b4e5ac8c" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1100.137575] env[65918]: DEBUG nova.compute.manager [None req-5bfef4da-d1f4-47fc-a19c-4d4d492d3ff5 tempest-ServerActionsTestOtherA-1867635188 tempest-ServerActionsTestOtherA-1867635188-project-member] [instance: a0f06a58-65d2-4325-8f93-0948b4e5ac8c] Instance network_info: |[{"id": "3308606a-7d78-4c8a-9933-53d19b489b59", "address": "fa:16:3e:7d:d0:2d", "network": {"id": "409d23af-ccaa-4f8f-8af9-0be9459eb1f0", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-457439374-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "91e03514a943477fb514c386edf9ec15", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "61b8f0db-488e-42d7-bf6c-6c1665cd5616", "external-id": "nsx-vlan-transportzone-655", "segmentation_id": 655, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3308606a-7d", "ovs_interfaceid": "3308606a-7d78-4c8a-9933-53d19b489b59", "qbh_params": null, 
"qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=65918) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1100.137821] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-5bfef4da-d1f4-47fc-a19c-4d4d492d3ff5 tempest-ServerActionsTestOtherA-1867635188 tempest-ServerActionsTestOtherA-1867635188-project-member] [instance: a0f06a58-65d2-4325-8f93-0948b4e5ac8c] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:7d:d0:2d', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '61b8f0db-488e-42d7-bf6c-6c1665cd5616', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '3308606a-7d78-4c8a-9933-53d19b489b59', 'vif_model': 'vmxnet3'}] {{(pid=65918) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1100.145395] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [None req-5bfef4da-d1f4-47fc-a19c-4d4d492d3ff5 tempest-ServerActionsTestOtherA-1867635188 tempest-ServerActionsTestOtherA-1867635188-project-member] Creating folder: Project (91e03514a943477fb514c386edf9ec15). Parent ref: group-v572679. {{(pid=65918) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1100.145880] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e16e5ae3-dace-4c69-a565-ea9285281f64 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1100.158320] env[65918]: INFO nova.virt.vmwareapi.vm_util [None req-5bfef4da-d1f4-47fc-a19c-4d4d492d3ff5 tempest-ServerActionsTestOtherA-1867635188 tempest-ServerActionsTestOtherA-1867635188-project-member] Created folder: Project (91e03514a943477fb514c386edf9ec15) in parent group-v572679. [ 1100.158506] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [None req-5bfef4da-d1f4-47fc-a19c-4d4d492d3ff5 tempest-ServerActionsTestOtherA-1867635188 tempest-ServerActionsTestOtherA-1867635188-project-member] Creating folder: Instances. Parent ref: group-v572739. {{(pid=65918) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1100.158717] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-951c717b-ce73-4f3f-aca8-a43fbc06ab8e {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1100.168599] env[65918]: INFO nova.virt.vmwareapi.vm_util [None req-5bfef4da-d1f4-47fc-a19c-4d4d492d3ff5 tempest-ServerActionsTestOtherA-1867635188 tempest-ServerActionsTestOtherA-1867635188-project-member] Created folder: Instances in parent group-v572739. [ 1100.168810] env[65918]: DEBUG oslo.service.loopingcall [None req-5bfef4da-d1f4-47fc-a19c-4d4d492d3ff5 tempest-ServerActionsTestOtherA-1867635188 tempest-ServerActionsTestOtherA-1867635188-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=65918) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1100.168978] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a0f06a58-65d2-4325-8f93-0948b4e5ac8c] Creating VM on the ESX host {{(pid=65918) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1100.169171] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-fbcb495c-ae80-4059-b340-0466e891fc94 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1100.186838] env[65918]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1100.186838] env[65918]: value = "task-2848230" [ 1100.186838] env[65918]: _type = "Task" [ 1100.186838] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1100.193882] env[65918]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848230, 'name': CreateVM_Task} progress is 0%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1100.631673] env[65918]: DEBUG nova.compute.manager [req-5e1349dc-3764-49a2-85a2-f2425aff9b03 req-55d8db74-ff91-4bae-9051-8ba033bdffc2 service nova] [instance: a0f06a58-65d2-4325-8f93-0948b4e5ac8c] Received event network-vif-plugged-3308606a-7d78-4c8a-9933-53d19b489b59 {{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1100.631890] env[65918]: DEBUG oslo_concurrency.lockutils [req-5e1349dc-3764-49a2-85a2-f2425aff9b03 req-55d8db74-ff91-4bae-9051-8ba033bdffc2 service nova] Acquiring lock "a0f06a58-65d2-4325-8f93-0948b4e5ac8c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1100.632131] env[65918]: DEBUG oslo_concurrency.lockutils [req-5e1349dc-3764-49a2-85a2-f2425aff9b03 req-55d8db74-ff91-4bae-9051-8ba033bdffc2 service nova] Lock "a0f06a58-65d2-4325-8f93-0948b4e5ac8c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1100.632304] env[65918]: DEBUG oslo_concurrency.lockutils [req-5e1349dc-3764-49a2-85a2-f2425aff9b03 req-55d8db74-ff91-4bae-9051-8ba033bdffc2 service nova] Lock "a0f06a58-65d2-4325-8f93-0948b4e5ac8c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1100.632475] env[65918]: DEBUG nova.compute.manager [req-5e1349dc-3764-49a2-85a2-f2425aff9b03 req-55d8db74-ff91-4bae-9051-8ba033bdffc2 service nova] [instance: a0f06a58-65d2-4325-8f93-0948b4e5ac8c] No waiting events found dispatching network-vif-plugged-3308606a-7d78-4c8a-9933-53d19b489b59 {{(pid=65918) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1100.632638] env[65918]: WARNING nova.compute.manager [req-5e1349dc-3764-49a2-85a2-f2425aff9b03 req-55d8db74-ff91-4bae-9051-8ba033bdffc2 service nova] [instance: a0f06a58-65d2-4325-8f93-0948b4e5ac8c] Received unexpected event network-vif-plugged-3308606a-7d78-4c8a-9933-53d19b489b59 for instance with vm_state building and task_state spawning. 
[ 1100.632798] env[65918]: DEBUG nova.compute.manager [req-5e1349dc-3764-49a2-85a2-f2425aff9b03 req-55d8db74-ff91-4bae-9051-8ba033bdffc2 service nova] [instance: a0f06a58-65d2-4325-8f93-0948b4e5ac8c] Received event network-changed-3308606a-7d78-4c8a-9933-53d19b489b59 {{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1100.632948] env[65918]: DEBUG nova.compute.manager [req-5e1349dc-3764-49a2-85a2-f2425aff9b03 req-55d8db74-ff91-4bae-9051-8ba033bdffc2 service nova] [instance: a0f06a58-65d2-4325-8f93-0948b4e5ac8c] Refreshing instance network info cache due to event network-changed-3308606a-7d78-4c8a-9933-53d19b489b59. {{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 1100.633326] env[65918]: DEBUG oslo_concurrency.lockutils [req-5e1349dc-3764-49a2-85a2-f2425aff9b03 req-55d8db74-ff91-4bae-9051-8ba033bdffc2 service nova] Acquiring lock "refresh_cache-a0f06a58-65d2-4325-8f93-0948b4e5ac8c" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1100.633487] env[65918]: DEBUG oslo_concurrency.lockutils [req-5e1349dc-3764-49a2-85a2-f2425aff9b03 req-55d8db74-ff91-4bae-9051-8ba033bdffc2 service nova] Acquired lock "refresh_cache-a0f06a58-65d2-4325-8f93-0948b4e5ac8c" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1100.633652] env[65918]: DEBUG nova.network.neutron [req-5e1349dc-3764-49a2-85a2-f2425aff9b03 req-55d8db74-ff91-4bae-9051-8ba033bdffc2 service nova] [instance: a0f06a58-65d2-4325-8f93-0948b4e5ac8c] Refreshing network info cache for port 3308606a-7d78-4c8a-9933-53d19b489b59 {{(pid=65918) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1100.696807] env[65918]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848230, 'name': CreateVM_Task, 'duration_secs': 0.299071} completed successfully. 
{{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1100.696944] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a0f06a58-65d2-4325-8f93-0948b4e5ac8c] Created VM on the ESX host {{(pid=65918) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1100.697675] env[65918]: DEBUG oslo_concurrency.lockutils [None req-5bfef4da-d1f4-47fc-a19c-4d4d492d3ff5 tempest-ServerActionsTestOtherA-1867635188 tempest-ServerActionsTestOtherA-1867635188-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1100.697837] env[65918]: DEBUG oslo_concurrency.lockutils [None req-5bfef4da-d1f4-47fc-a19c-4d4d492d3ff5 tempest-ServerActionsTestOtherA-1867635188 tempest-ServerActionsTestOtherA-1867635188-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1100.698229] env[65918]: DEBUG oslo_concurrency.lockutils [None req-5bfef4da-d1f4-47fc-a19c-4d4d492d3ff5 tempest-ServerActionsTestOtherA-1867635188 tempest-ServerActionsTestOtherA-1867635188-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1100.698490] env[65918]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8516ac52-3bb4-4705-80fb-b99b4f69812a {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1100.702737] env[65918]: DEBUG oslo_vmware.api [None req-5bfef4da-d1f4-47fc-a19c-4d4d492d3ff5 tempest-ServerActionsTestOtherA-1867635188 tempest-ServerActionsTestOtherA-1867635188-project-member] Waiting for the task: (returnval){ [ 1100.702737] env[65918]: value = "session[523f5ec2-7667-1e2c-bfa3-3051980fd847]521077f3-af2f-a021-9613-29ab0cc23d76" [ 1100.702737] env[65918]: _type = "Task" [ 1100.702737] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1100.710409] env[65918]: DEBUG oslo_vmware.api [None req-5bfef4da-d1f4-47fc-a19c-4d4d492d3ff5 tempest-ServerActionsTestOtherA-1867635188 tempest-ServerActionsTestOtherA-1867635188-project-member] Task: {'id': session[523f5ec2-7667-1e2c-bfa3-3051980fd847]521077f3-af2f-a021-9613-29ab0cc23d76, 'name': SearchDatastore_Task} progress is 0%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1101.096039] env[65918]: DEBUG nova.network.neutron [req-5e1349dc-3764-49a2-85a2-f2425aff9b03 req-55d8db74-ff91-4bae-9051-8ba033bdffc2 service nova] [instance: a0f06a58-65d2-4325-8f93-0948b4e5ac8c] Updated VIF entry in instance network info cache for port 3308606a-7d78-4c8a-9933-53d19b489b59. 
{{(pid=65918) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1101.096427] env[65918]: DEBUG nova.network.neutron [req-5e1349dc-3764-49a2-85a2-f2425aff9b03 req-55d8db74-ff91-4bae-9051-8ba033bdffc2 service nova] [instance: a0f06a58-65d2-4325-8f93-0948b4e5ac8c] Updating instance_info_cache with network_info: [{"id": "3308606a-7d78-4c8a-9933-53d19b489b59", "address": "fa:16:3e:7d:d0:2d", "network": {"id": "409d23af-ccaa-4f8f-8af9-0be9459eb1f0", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-457439374-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "91e03514a943477fb514c386edf9ec15", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "61b8f0db-488e-42d7-bf6c-6c1665cd5616", "external-id": "nsx-vlan-transportzone-655", "segmentation_id": 655, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3308606a-7d", "ovs_interfaceid": "3308606a-7d78-4c8a-9933-53d19b489b59", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1101.105663] env[65918]: DEBUG oslo_concurrency.lockutils [req-5e1349dc-3764-49a2-85a2-f2425aff9b03 req-55d8db74-ff91-4bae-9051-8ba033bdffc2 service nova] Releasing lock "refresh_cache-a0f06a58-65d2-4325-8f93-0948b4e5ac8c" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1101.212670] env[65918]: DEBUG oslo_concurrency.lockutils [None req-5bfef4da-d1f4-47fc-a19c-4d4d492d3ff5 tempest-ServerActionsTestOtherA-1867635188 tempest-ServerActionsTestOtherA-1867635188-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1101.212916] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-5bfef4da-d1f4-47fc-a19c-4d4d492d3ff5 tempest-ServerActionsTestOtherA-1867635188 tempest-ServerActionsTestOtherA-1867635188-project-member] [instance: a0f06a58-65d2-4325-8f93-0948b4e5ac8c] Processing image e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1101.213139] env[65918]: DEBUG oslo_concurrency.lockutils [None req-5bfef4da-d1f4-47fc-a19c-4d4d492d3ff5 tempest-ServerActionsTestOtherA-1867635188 tempest-ServerActionsTestOtherA-1867635188-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1114.422932] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1115.419828] env[65918]: DEBUG oslo_service.periodic_task [None 
req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1116.424673] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1116.425109] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=65918) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 1117.424473] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1117.424675] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Starting heal instance info cache {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 1117.424801] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Rebuilding the list of instances to heal {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 1117.439786] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1117.439936] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: 3faaaacf-815e-4493-81a7-2a32f868442a] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1117.440079] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: 8017964a-7fe8-40eb-a79d-47e0401a27d1] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1117.440211] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: a0f06a58-65d2-4325-8f93-0948b4e5ac8c] Skipping network cache update for instance because it is Building. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1117.440378] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Didn't find any instances for network info cache update. 
{{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 1117.440809] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager.update_available_resource {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1117.449591] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1117.449789] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1117.449950] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1117.450117] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=65918) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1117.451125] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7db9b063-6055-4379-9652-8d63e3d2452f {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1117.459701] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1bc27085-6055-4757-9453-32176e84c3ad {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1117.473315] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00d04e63-0b2e-457b-86ee-1693485dfa81 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1117.479267] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21c1a975-5736-4a84-8d4c-53ee4a0566bc {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1117.509015] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181045MB free_disk=168GB free_vcpus=48 pci_devices=None {{(pid=65918) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1117.509174] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1117.509351] env[65918]: DEBUG oslo_concurrency.lockutils [None 
req-32ffbafa-5870-4871-bce4-645109ad039d None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1117.552810] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance bba6f3d9-1be3-4048-86d5-f435511b0fc0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1117.552960] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 3faaaacf-815e-4493-81a7-2a32f868442a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1117.553099] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 8017964a-7fe8-40eb-a79d-47e0401a27d1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1117.553222] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance a0f06a58-65d2-4325-8f93-0948b4e5ac8c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1117.562978] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Instance 81fef129-8f9a-4a19-afc0-f27411c36159 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=65918) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 1117.563194] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Total usable vcpus: 48, total allocated vcpus: 4 {{(pid=65918) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1117.563343] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1024MB phys_disk=200GB used_disk=4GB total_vcpus=48 used_vcpus=4 pci_stats=[] {{(pid=65918) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1117.627743] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9fa4060-44d2-42f9-8384-6d29a6aad795 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1117.635155] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d38c190-c7de-4b32-8468-2637c7b9178f {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1117.664077] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82bcf055-a1a4-4ef5-935e-8765c7327ac3 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1117.670785] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b2881f3-54b9-4f6c-a0aa-c4063a9574c1 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1117.683301] env[65918]: DEBUG nova.compute.provider_tree [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Inventory has not changed in ProviderTree for provider: 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 {{(pid=65918) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1117.691425] env[65918]: DEBUG nova.scheduler.client.report [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Inventory has not changed for provider 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65918) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1117.703777] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=65918) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1117.704019] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.195s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1118.686692] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] 
Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1119.423095] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1120.423571] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1121.424381] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1137.179071] env[65918]: DEBUG nova.compute.manager [req-df27ff9a-b67a-4456-9557-1094600772de req-0bd3357c-b2d5-4873-ac22-190657c8f6d8 service nova] [instance: 3faaaacf-815e-4493-81a7-2a32f868442a] Received event network-vif-deleted-7486ddef-cbd9-4389-b4ab-91605f65fdc1 {{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1139.206598] env[65918]: DEBUG nova.compute.manager [req-75fbe765-a732-4535-bd74-a276f19ba971 req-65add9b2-f3ba-40ef-a1f7-d116747e0b67 service nova] [instance: 8017964a-7fe8-40eb-a79d-47e0401a27d1] Received event network-vif-deleted-e887b448-d7f4-4ed0-90d8-5d8516e6ca78 {{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1142.621160] env[65918]: DEBUG nova.compute.manager [req-fbcad970-bc1e-4465-ac73-ea2f1794ea69 req-df775b93-f75c-4b19-aefb-9074b4ff5e66 service nova] [instance: a0f06a58-65d2-4325-8f93-0948b4e5ac8c] Received event network-vif-deleted-3308606a-7d78-4c8a-9933-53d19b489b59 {{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1143.263191] env[65918]: WARNING oslo_vmware.rw_handles [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1143.263191] env[65918]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1143.263191] env[65918]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1143.263191] env[65918]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1143.263191] env[65918]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1143.263191] env[65918]: ERROR oslo_vmware.rw_handles response.begin() [ 1143.263191] env[65918]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1143.263191] env[65918]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1143.263191] env[65918]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1143.263191] env[65918]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1143.263191] env[65918]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without 
response [ 1143.263191] env[65918]: ERROR oslo_vmware.rw_handles [ 1143.263898] env[65918]: DEBUG nova.virt.vmwareapi.images [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Downloaded image file data e017c336-3a02-4b58-874a-44a1d1e154fd to vmware_temp/c629e49c-430c-4636-95e5-b225a056ad94/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk on the data store datastore1 {{(pid=65918) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1143.265769] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Caching image {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1143.266081] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Copying Virtual Disk [datastore1] vmware_temp/c629e49c-430c-4636-95e5-b225a056ad94/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk to [datastore1] vmware_temp/c629e49c-430c-4636-95e5-b225a056ad94/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk {{(pid=65918) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1143.266417] env[65918]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-c5c2a6ee-4265-4b94-9fbc-83adbc7bdd6b {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1143.275290] env[65918]: DEBUG oslo_vmware.api [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Waiting for the task: (returnval){ [ 1143.275290] env[65918]: value = "task-2848231" [ 1143.275290] env[65918]: _type = "Task" [ 1143.275290] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1143.282848] env[65918]: DEBUG oslo_vmware.api [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Task: {'id': task-2848231, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1143.784799] env[65918]: DEBUG oslo_vmware.exceptions [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Fault InvalidArgument not matched. 
{{(pid=65918) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1143.785071] env[65918]: DEBUG oslo_concurrency.lockutils [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1143.785614] env[65918]: ERROR nova.compute.manager [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1143.785614] env[65918]: Faults: ['InvalidArgument'] [ 1143.785614] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Traceback (most recent call last): [ 1143.785614] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1143.785614] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] yield resources [ 1143.785614] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1143.785614] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] self.driver.spawn(context, instance, image_meta, [ 1143.785614] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1143.785614] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1143.785614] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1143.785614] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] self._fetch_image_if_missing(context, vi) [ 1143.785614] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1143.786153] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] image_cache(vi, tmp_image_ds_loc) [ 1143.786153] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1143.786153] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] vm_util.copy_virtual_disk( [ 1143.786153] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1143.786153] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] session._wait_for_task(vmdk_copy_task) [ 1143.786153] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 
157, in _wait_for_task [ 1143.786153] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] return self.wait_for_task(task_ref) [ 1143.786153] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1143.786153] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] return evt.wait() [ 1143.786153] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1143.786153] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] result = hub.switch() [ 1143.786153] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1143.786153] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] return self.greenlet.switch() [ 1143.786493] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1143.786493] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] self.f(*self.args, **self.kw) [ 1143.786493] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1143.786493] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] raise exceptions.translate_fault(task_info.error) [ 1143.786493] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1143.786493] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Faults: ['InvalidArgument'] [ 1143.786493] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] [ 1143.786493] env[65918]: INFO nova.compute.manager [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Terminating instance [ 1143.787507] env[65918]: DEBUG oslo_concurrency.lockutils [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1143.787707] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1143.788363] env[65918]: DEBUG nova.compute.manager [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Start destroying the 
instance on the hypervisor. {{(pid=65918) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1143.788552] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Destroying instance {{(pid=65918) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1143.788766] env[65918]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f5f3d88f-b55c-4c80-a96f-4220e5134b41 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1143.791088] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb2b0601-23d1-4fcc-b72a-18b31acd09c5 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1143.797789] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Unregistering the VM {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1143.798035] env[65918]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-cc476f14-b3f7-4d41-bbce-01e2568c7651 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1143.800244] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1143.800415] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=65918) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1143.801352] env[65918]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a6eb218c-8a83-432b-85fd-cf96e27bef08 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1143.805961] env[65918]: DEBUG oslo_vmware.api [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] Waiting for the task: (returnval){ [ 1143.805961] env[65918]: value = "session[523f5ec2-7667-1e2c-bfa3-3051980fd847]52c3f358-be34-fac8-5f23-dcebf890d248" [ 1143.805961] env[65918]: _type = "Task" [ 1143.805961] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1143.812906] env[65918]: DEBUG oslo_vmware.api [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] Task: {'id': session[523f5ec2-7667-1e2c-bfa3-3051980fd847]52c3f358-be34-fac8-5f23-dcebf890d248, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1143.859676] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Unregistered the VM {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1143.859909] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Deleting contents of the VM from datastore datastore1 {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1143.860028] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Deleting the datastore file [datastore1] bba6f3d9-1be3-4048-86d5-f435511b0fc0 {{(pid=65918) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1143.860288] env[65918]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-a758424c-3eaa-4b60-82ff-6510a4160b36 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1143.865856] env[65918]: DEBUG oslo_vmware.api [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Waiting for the task: (returnval){ [ 1143.865856] env[65918]: value = "task-2848233" [ 1143.865856] env[65918]: _type = "Task" [ 1143.865856] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1143.873209] env[65918]: DEBUG oslo_vmware.api [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Task: {'id': task-2848233, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1144.317353] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Preparing fetch location {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1144.317687] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] Creating directory with path [datastore1] vmware_temp/7bf1ebc6-3d8e-4e2a-a8d5-5f8a3b3ea03c/e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1144.317925] env[65918]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-366d7276-a6c0-41d1-a753-7ac94ba5d3d6 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1144.328940] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] Created directory with path [datastore1] vmware_temp/7bf1ebc6-3d8e-4e2a-a8d5-5f8a3b3ea03c/e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1144.329154] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Fetch image to [datastore1] vmware_temp/7bf1ebc6-3d8e-4e2a-a8d5-5f8a3b3ea03c/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1144.329360] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Downloading image file data e017c336-3a02-4b58-874a-44a1d1e154fd to [datastore1] vmware_temp/7bf1ebc6-3d8e-4e2a-a8d5-5f8a3b3ea03c/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk on the data store datastore1 {{(pid=65918) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1144.330120] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4cfc26d-388e-4c69-8d34-80d7ed00d132 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1144.336785] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d0c8283-5ef5-4284-9bc2-111cc9db74ae {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1144.345875] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-70c9eaae-4c10-4dce-bb72-738d15a6e652 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1144.380244] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-10348bd8-9e76-4ea1-bc84-9a1961257507 {{(pid=65918) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1144.387730] env[65918]: DEBUG oslo_vmware.api [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Task: {'id': task-2848233, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.079052} completed successfully. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1144.389232] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Deleted the datastore file {{(pid=65918) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1144.389422] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Deleted contents of the VM from datastore datastore1 {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1144.389595] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Instance destroyed {{(pid=65918) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1144.389766] env[65918]: INFO nova.compute.manager [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1144.391536] env[65918]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-12834617-7267-484c-a81b-c2ee39c66e7e {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1144.393412] env[65918]: DEBUG nova.compute.claims [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Aborting claim: {{(pid=65918) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1144.393582] env[65918]: DEBUG oslo_concurrency.lockutils [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1144.393799] env[65918]: DEBUG oslo_concurrency.lockutils [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1144.415229] env[65918]: DEBUG nova.virt.vmwareapi.images [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Downloading image file data e017c336-3a02-4b58-874a-44a1d1e154fd to the data store datastore1 {{(pid=65918) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1144.471143] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2edcf014-2706-4c5e-9b59-1624db3af1ed {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1144.478663] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-74617e0e-89d1-4c8b-af4f-c18836f29d25 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1144.511000] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7afdddfa-dd27-4fea-be5a-1490fd34728d {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1144.518594] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec483efa-76bd-480f-a234-1b95b1e36925 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1144.533500] env[65918]: DEBUG nova.compute.provider_tree [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Inventory has not changed in ProviderTree for provider: 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 {{(pid=65918) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1144.544405] env[65918]: DEBUG nova.scheduler.client.report [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Inventory has not changed for provider 
0bcf3fd3-93ee-4c0a-abed-95169e714cc4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65918) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1144.556099] env[65918]: DEBUG oslo_concurrency.lockutils [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.162s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1144.556622] env[65918]: ERROR nova.compute.manager [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1144.556622] env[65918]: Faults: ['InvalidArgument'] [ 1144.556622] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Traceback (most recent call last): [ 1144.556622] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1144.556622] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] self.driver.spawn(context, instance, image_meta, [ 1144.556622] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1144.556622] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1144.556622] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1144.556622] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] self._fetch_image_if_missing(context, vi) [ 1144.556622] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1144.556622] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] image_cache(vi, tmp_image_ds_loc) [ 1144.556622] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1144.556977] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] vm_util.copy_virtual_disk( [ 1144.556977] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1144.556977] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] session._wait_for_task(vmdk_copy_task) [ 1144.556977] env[65918]: ERROR nova.compute.manager [instance: 
bba6f3d9-1be3-4048-86d5-f435511b0fc0] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1144.556977] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] return self.wait_for_task(task_ref) [ 1144.556977] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1144.556977] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] return evt.wait() [ 1144.556977] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1144.556977] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] result = hub.switch() [ 1144.556977] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1144.556977] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] return self.greenlet.switch() [ 1144.556977] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1144.556977] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] self.f(*self.args, **self.kw) [ 1144.557296] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1144.557296] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] raise exceptions.translate_fault(task_info.error) [ 1144.557296] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1144.557296] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Faults: ['InvalidArgument'] [ 1144.557296] env[65918]: ERROR nova.compute.manager [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] [ 1144.557296] env[65918]: DEBUG nova.compute.utils [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] VimFaultException {{(pid=65918) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1144.559045] env[65918]: DEBUG nova.compute.manager [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Build of instance bba6f3d9-1be3-4048-86d5-f435511b0fc0 was re-scheduled: A specified parameter was not correct: fileType [ 1144.559045] env[65918]: Faults: ['InvalidArgument'] {{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 1144.559455] env[65918]: DEBUG nova.compute.manager [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Unplugging VIFs for instance {{(pid=65918) _cleanup_allocated_networks 
/opt/stack/nova/nova/compute/manager.py:2976}} [ 1144.559627] env[65918]: DEBUG nova.compute.manager [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=65918) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1144.559794] env[65918]: DEBUG nova.compute.manager [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Deallocating network for instance {{(pid=65918) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1144.559953] env[65918]: DEBUG nova.network.neutron [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] deallocate_for_instance() {{(pid=65918) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1144.657572] env[65918]: DEBUG oslo_concurrency.lockutils [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1144.659120] env[65918]: ERROR nova.compute.manager [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image e017c336-3a02-4b58-874a-44a1d1e154fd. 
[ 1144.659120] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Traceback (most recent call last): [ 1144.659120] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1144.659120] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1144.659120] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1144.659120] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] result = getattr(controller, method)(*args, **kwargs) [ 1144.659120] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1144.659120] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] return self._get(image_id) [ 1144.659120] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1144.659120] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1144.659120] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1144.659469] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] resp, body = self.http_client.get(url, headers=header) [ 1144.659469] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1144.659469] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] return self.request(url, 'GET', **kwargs) [ 1144.659469] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1144.659469] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] return self._handle_response(resp) [ 1144.659469] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1144.659469] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] raise exc.from_response(resp, resp.content) [ 1144.659469] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1144.659469] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] [ 1144.659469] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] During handling of the above exception, another exception occurred: [ 1144.659469] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] [ 1144.659469] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Traceback (most recent call last): [ 1144.659817] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1144.659817] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] yield resources [ 1144.659817] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1144.659817] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] self.driver.spawn(context, instance, image_meta, [ 1144.659817] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1144.659817] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1144.659817] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1144.659817] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] self._fetch_image_if_missing(context, vi) [ 1144.659817] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1144.659817] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] image_fetch(context, vi, tmp_image_ds_loc) [ 1144.659817] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1144.659817] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] images.fetch_image( [ 1144.659817] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1144.660221] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] metadata = IMAGE_API.get(context, image_ref) [ 1144.660221] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1144.660221] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] return session.show(context, image_id, [ 1144.660221] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1144.660221] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] _reraise_translated_image_exception(image_id) [ 1144.660221] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File 
"/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1144.660221] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] raise new_exc.with_traceback(exc_trace) [ 1144.660221] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1144.660221] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1144.660221] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1144.660221] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] result = getattr(controller, method)(*args, **kwargs) [ 1144.660221] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1144.660221] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] return self._get(image_id) [ 1144.660598] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1144.660598] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1144.660598] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1144.660598] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] resp, body = self.http_client.get(url, headers=header) [ 1144.660598] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1144.660598] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] return self.request(url, 'GET', **kwargs) [ 1144.660598] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1144.660598] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] return self._handle_response(resp) [ 1144.660598] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1144.660598] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] raise exc.from_response(resp, resp.content) [ 1144.660598] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] nova.exception.ImageNotAuthorized: Not authorized for image e017c336-3a02-4b58-874a-44a1d1e154fd. 
[ 1144.660598] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] [ 1144.660930] env[65918]: INFO nova.compute.manager [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Terminating instance [ 1144.661763] env[65918]: DEBUG nova.compute.manager [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Start destroying the instance on the hypervisor. {{(pid=65918) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1144.661963] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Destroying instance {{(pid=65918) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1144.662293] env[65918]: DEBUG oslo_concurrency.lockutils [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1144.662487] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1144.663283] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-597cef39-1264-424c-886f-aecf5f63009d {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1144.665957] env[65918]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f85a763b-2309-4fd5-b359-f700ce22d911 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1144.672513] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Unregistering the VM {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1144.672730] env[65918]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-ec5d7f5e-79ba-4789-9b43-b1a01720146a {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1144.675213] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1144.675388] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 
tempest-ServerDiskConfigTestJSON-1421108446-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=65918) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1144.676317] env[65918]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-08242405-6d2b-4b6c-b728-ba440b620dc6 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1144.681611] env[65918]: DEBUG oslo_vmware.api [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] Waiting for the task: (returnval){ [ 1144.681611] env[65918]: value = "session[523f5ec2-7667-1e2c-bfa3-3051980fd847]52339e81-5b2d-1216-48f4-d0d6daceaf18" [ 1144.681611] env[65918]: _type = "Task" [ 1144.681611] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1144.693986] env[65918]: DEBUG oslo_vmware.api [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] Task: {'id': session[523f5ec2-7667-1e2c-bfa3-3051980fd847]52339e81-5b2d-1216-48f4-d0d6daceaf18, 'name': SearchDatastore_Task} progress is 0%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1144.734134] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Unregistered the VM {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1144.734360] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Deleting contents of the VM from datastore datastore1 {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1144.734540] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] Deleting the datastore file [datastore1] 3b3f8c10-5ba5-445c-a51d-5404874df3d9 {{(pid=65918) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1144.734798] env[65918]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-4ba1d9f6-7021-4abf-9771-660f2ba3f886 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1144.741359] env[65918]: DEBUG oslo_vmware.api [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] Waiting for the task: (returnval){ [ 1144.741359] env[65918]: value = "task-2848235" [ 1144.741359] env[65918]: _type = "Task" [ 1144.741359] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1144.751311] env[65918]: DEBUG oslo_vmware.api [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] Task: {'id': task-2848235, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1144.939029] env[65918]: DEBUG nova.network.neutron [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Updating instance_info_cache with network_info: [] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1144.949763] env[65918]: INFO nova.compute.manager [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Took 0.39 seconds to deallocate network for instance. [ 1145.065385] env[65918]: INFO nova.scheduler.client.report [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Deleted allocations for instance bba6f3d9-1be3-4048-86d5-f435511b0fc0 [ 1145.082010] env[65918]: DEBUG oslo_concurrency.lockutils [None req-e3e50f1a-5c2f-4e55-b2b2-ae94cf821c12 tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Lock "bba6f3d9-1be3-4048-86d5-f435511b0fc0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 512.086s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1145.083615] env[65918]: DEBUG oslo_concurrency.lockutils [None req-fb2be1a6-8e8c-4f90-8667-a6240f41667f tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Lock "bba6f3d9-1be3-4048-86d5-f435511b0fc0" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 312.955s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1145.083893] env[65918]: DEBUG oslo_concurrency.lockutils [None req-fb2be1a6-8e8c-4f90-8667-a6240f41667f tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Acquiring lock "bba6f3d9-1be3-4048-86d5-f435511b0fc0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1145.084127] env[65918]: DEBUG oslo_concurrency.lockutils [None req-fb2be1a6-8e8c-4f90-8667-a6240f41667f tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Lock "bba6f3d9-1be3-4048-86d5-f435511b0fc0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1145.084301] env[65918]: DEBUG oslo_concurrency.lockutils [None req-fb2be1a6-8e8c-4f90-8667-a6240f41667f tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Lock "bba6f3d9-1be3-4048-86d5-f435511b0fc0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1145.086066] env[65918]: INFO nova.compute.manager [None req-fb2be1a6-8e8c-4f90-8667-a6240f41667f tempest-AttachInterfacesV270Test-118737708 
tempest-AttachInterfacesV270Test-118737708-project-member] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Terminating instance [ 1145.087714] env[65918]: DEBUG nova.compute.manager [None req-fb2be1a6-8e8c-4f90-8667-a6240f41667f tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Start destroying the instance on the hypervisor. {{(pid=65918) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1145.087911] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-fb2be1a6-8e8c-4f90-8667-a6240f41667f tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Destroying instance {{(pid=65918) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1145.088426] env[65918]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-4f7e25c6-b08c-4bc8-abef-f5d7d2736a09 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1145.096741] env[65918]: DEBUG nova.compute.manager [None req-09d594fa-82ce-4c0d-b5cc-e32fe0de1bde tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 81fef129-8f9a-4a19-afc0-f27411c36159] Starting instance... {{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1145.102479] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-348e100e-f4e4-49be-afc8-88e4e74e6c6b {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1145.131490] env[65918]: WARNING nova.virt.vmwareapi.vmops [None req-fb2be1a6-8e8c-4f90-8667-a6240f41667f tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance bba6f3d9-1be3-4048-86d5-f435511b0fc0 could not be found. [ 1145.131688] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-fb2be1a6-8e8c-4f90-8667-a6240f41667f tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Instance destroyed {{(pid=65918) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1145.131863] env[65918]: INFO nova.compute.manager [None req-fb2be1a6-8e8c-4f90-8667-a6240f41667f tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1145.132110] env[65918]: DEBUG oslo.service.loopingcall [None req-fb2be1a6-8e8c-4f90-8667-a6240f41667f tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=65918) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1145.136618] env[65918]: DEBUG nova.compute.manager [-] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Deallocating network for instance {{(pid=65918) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1145.136711] env[65918]: DEBUG nova.network.neutron [-] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] deallocate_for_instance() {{(pid=65918) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1145.148058] env[65918]: DEBUG oslo_concurrency.lockutils [None req-09d594fa-82ce-4c0d-b5cc-e32fe0de1bde tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1145.148366] env[65918]: DEBUG oslo_concurrency.lockutils [None req-09d594fa-82ce-4c0d-b5cc-e32fe0de1bde tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1145.149784] env[65918]: INFO nova.compute.claims [None req-09d594fa-82ce-4c0d-b5cc-e32fe0de1bde tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 81fef129-8f9a-4a19-afc0-f27411c36159] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1145.160843] env[65918]: DEBUG nova.network.neutron [-] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Updating instance_info_cache with network_info: [] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1145.169948] env[65918]: INFO nova.compute.manager [-] [instance: bba6f3d9-1be3-4048-86d5-f435511b0fc0] Took 0.03 seconds to deallocate network for instance. 
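The recurring "Acquiring lock ..." / "Lock ... acquired ... waited" / "Lock ... released ... held" DEBUG lines throughout this run (for "compute_resources", the instance UUID locks, the image-cache path, and so on) are emitted by oslo.concurrency's lock helpers, which nova-compute wraps around resource claims and per-instance operations. A minimal sketch of that pattern follows, with a made-up lock name and function, only to show where the waited/held timings come from:

# Illustrative only: the lock name and claim_resources() below are made up.
from oslo_concurrency import lockutils

@lockutils.synchronized('compute_resources_demo', 'demo-')
def claim_resources(instance_uuid):
    # While this body runs, other callers decorated with the same lock name
    # block; lockutils logs 'acquired ... waited Ns' on entry and
    # 'released ... held Ns' on exit, matching the nova-compute lines above.
    return instance_uuid

claim_resources('3b3f8c10-5ba5-445c-a51d-5404874df3d9')

The large waited/held values in the surrounding entries (for example 512.086s held and 312.955s waited on the bba6f3d9 lock) are the time the decorated build and terminate paths spent inside, or queued behind, those named locks.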
[ 1145.199829] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] Preparing fetch location {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1145.200085] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] Creating directory with path [datastore1] vmware_temp/b76abd91-dd23-44b4-baa6-732f54c76349/e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1145.201060] env[65918]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0c200722-0e92-4b13-84aa-9ae19497ee1d {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1145.220255] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] Created directory with path [datastore1] vmware_temp/b76abd91-dd23-44b4-baa6-732f54c76349/e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1145.220606] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] Fetch image to [datastore1] vmware_temp/b76abd91-dd23-44b4-baa6-732f54c76349/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1145.220695] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] Downloading image file data e017c336-3a02-4b58-874a-44a1d1e154fd to [datastore1] vmware_temp/b76abd91-dd23-44b4-baa6-732f54c76349/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk on the data store datastore1 {{(pid=65918) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1145.222050] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c7ede2a-5846-4bc7-8848-50e507f6f317 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1145.229862] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c49238b-ed95-4610-b780-790cd8d4da5d {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1145.232767] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d8f3220e-5cc2-4750-8852-ad133e8d0e7b {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1145.243629] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4b0a6dd-d04c-47fa-b1c1-b4398bc4f410 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1145.251020] 
env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7bfd8cf-83ad-4ed1-ab98-204a52536c40 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1145.282693] env[65918]: DEBUG oslo_concurrency.lockutils [None req-fb2be1a6-8e8c-4f90-8667-a6240f41667f tempest-AttachInterfacesV270Test-118737708 tempest-AttachInterfacesV270Test-118737708-project-member] Lock "bba6f3d9-1be3-4048-86d5-f435511b0fc0" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.199s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1145.285551] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ebe37ee8-886a-40b6-957d-25902557a64c {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1145.291650] env[65918]: DEBUG oslo_vmware.api [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] Task: {'id': task-2848235, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074283} completed successfully. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1145.292166] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] Deleted the datastore file {{(pid=65918) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1145.292354] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Deleted contents of the VM from datastore datastore1 {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1145.292523] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Instance destroyed {{(pid=65918) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1145.292688] env[65918]: INFO nova.compute.manager [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Took 0.63 seconds to destroy the instance on the hypervisor. 
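The build failure recorded above for instance 3b3f8c10-5ba5-445c-a51d-5404874df3d9 bottoms out in a plain HTTP 401 from Glance, which Nova translates into ImageNotAuthorized before tearing the guest back down. When triaging this, it can help to repeat the same image GET outside of Nova to see whether the credentials in use can still read image e017c336-3a02-4b58-874a-44a1d1e154fd at all. A minimal sketch, assuming keystoneauth1 password auth and python-glanceclient; the auth URL and credential values are placeholders:

# Illustrative only: auth_url, username, password and project are placeholders.
from keystoneauth1 import loading, session
import glanceclient

loader = loading.get_plugin_loader('password')
auth = loader.load_from_options(
    auth_url='http://controller/identity/v3',
    username='demo', password='secret', project_name='demo',
    user_domain_id='default', project_domain_id='default')
sess = session.Session(auth=auth)

glance = glanceclient.Client('2', session=sess)
try:
    image = glance.images.get('e017c336-3a02-4b58-874a-44a1d1e154fd')
    print(image['status'])
except Exception as exc:
    # A 401 surfaces as glanceclient.exc.HTTPUnauthorized, the same class
    # shown in the traceback above.
    print('image GET failed:', exc)

Note that the entries that follow show the same image UUID being fetched for instance 0ccebca0-a1a4-48b2-9154-1c73350dab38, which suggests the 401 here is tied to the requesting context (an expired or revoked token) rather than to the image itself.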
[ 1145.321262] env[65918]: DEBUG nova.compute.claims [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Aborting claim: {{(pid=65918) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1145.321429] env[65918]: DEBUG oslo_concurrency.lockutils [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1145.322099] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a9f2e09-4a17-4208-88f7-b476f09f3517 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1145.324457] env[65918]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-ce0de37d-626b-42e3-af69-de456d9f5c8d {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1145.330707] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e58a330a-0205-41e1-b85a-40393c388674 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1145.344275] env[65918]: DEBUG nova.compute.provider_tree [None req-09d594fa-82ce-4c0d-b5cc-e32fe0de1bde tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Inventory has not changed in ProviderTree for provider: 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 {{(pid=65918) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1145.346663] env[65918]: DEBUG nova.virt.vmwareapi.images [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] Downloading image file data e017c336-3a02-4b58-874a-44a1d1e154fd to the data store datastore1 {{(pid=65918) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1145.352417] env[65918]: DEBUG nova.scheduler.client.report [None req-09d594fa-82ce-4c0d-b5cc-e32fe0de1bde tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Inventory has not changed for provider 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65918) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1145.365583] env[65918]: DEBUG oslo_concurrency.lockutils [None req-09d594fa-82ce-4c0d-b5cc-e32fe0de1bde tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.217s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 
1145.366127] env[65918]: DEBUG nova.compute.manager [None req-09d594fa-82ce-4c0d-b5cc-e32fe0de1bde tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 81fef129-8f9a-4a19-afc0-f27411c36159] Start building networks asynchronously for instance. {{(pid=65918) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1145.368258] env[65918]: DEBUG oslo_concurrency.lockutils [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.047s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1145.390630] env[65918]: DEBUG oslo_concurrency.lockutils [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.022s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1145.391699] env[65918]: DEBUG nova.compute.utils [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Instance 3b3f8c10-5ba5-445c-a51d-5404874df3d9 could not be found. {{(pid=65918) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1145.392748] env[65918]: DEBUG nova.compute.manager [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Instance disappeared during build. {{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1145.392894] env[65918]: DEBUG nova.compute.manager [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Unplugging VIFs for instance {{(pid=65918) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1145.393064] env[65918]: DEBUG nova.compute.manager [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=65918) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1145.393234] env[65918]: DEBUG nova.compute.manager [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Deallocating network for instance {{(pid=65918) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1145.393386] env[65918]: DEBUG nova.network.neutron [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] deallocate_for_instance() {{(pid=65918) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1145.397378] env[65918]: DEBUG nova.compute.utils [None req-09d594fa-82ce-4c0d-b5cc-e32fe0de1bde tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Using /dev/sd instead of None {{(pid=65918) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1145.398557] env[65918]: DEBUG nova.compute.manager [None req-09d594fa-82ce-4c0d-b5cc-e32fe0de1bde tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 81fef129-8f9a-4a19-afc0-f27411c36159] Allocating IP information in the background. {{(pid=65918) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1145.398811] env[65918]: DEBUG nova.network.neutron [None req-09d594fa-82ce-4c0d-b5cc-e32fe0de1bde tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 81fef129-8f9a-4a19-afc0-f27411c36159] allocate_for_instance() {{(pid=65918) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1145.408936] env[65918]: DEBUG nova.compute.manager [None req-09d594fa-82ce-4c0d-b5cc-e32fe0de1bde tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 81fef129-8f9a-4a19-afc0-f27411c36159] Start building block device mappings for instance. {{(pid=65918) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1145.422455] env[65918]: DEBUG neutronclient.v2_0.client [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=65918) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1145.426221] env[65918]: ERROR nova.compute.manager [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 1145.426221] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Traceback (most recent call last): [ 1145.426221] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1145.426221] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1145.426221] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1145.426221] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] result = getattr(controller, method)(*args, **kwargs) [ 1145.426221] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1145.426221] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] return self._get(image_id) [ 1145.426221] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1145.426221] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1145.426221] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1145.426221] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] resp, body = self.http_client.get(url, headers=header) [ 1145.426570] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1145.426570] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] return self.request(url, 'GET', **kwargs) [ 1145.426570] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1145.426570] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] return self._handle_response(resp) [ 1145.426570] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1145.426570] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] raise exc.from_response(resp, resp.content) [ 1145.426570] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1145.426570] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] [ 1145.426570] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] During handling of the above exception, another exception occurred: [ 1145.426570] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] [ 1145.426570] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Traceback (most recent call last): [ 1145.426570] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1145.426965] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] self.driver.spawn(context, instance, image_meta, [ 1145.426965] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1145.426965] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1145.426965] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1145.426965] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] self._fetch_image_if_missing(context, vi) [ 1145.426965] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1145.426965] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] image_fetch(context, vi, tmp_image_ds_loc) [ 1145.426965] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1145.426965] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] images.fetch_image( [ 1145.426965] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1145.426965] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] metadata = IMAGE_API.get(context, image_ref) [ 1145.426965] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1145.426965] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] return session.show(context, image_id, [ 1145.427303] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1145.427303] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] _reraise_translated_image_exception(image_id) [ 1145.427303] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1145.427303] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] raise new_exc.with_traceback(exc_trace) [ 1145.427303] env[65918]: ERROR nova.compute.manager [instance: 
3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1145.427303] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1145.427303] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1145.427303] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] result = getattr(controller, method)(*args, **kwargs) [ 1145.427303] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1145.427303] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] return self._get(image_id) [ 1145.427303] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1145.427303] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1145.427303] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1145.427627] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] resp, body = self.http_client.get(url, headers=header) [ 1145.427627] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1145.427627] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] return self.request(url, 'GET', **kwargs) [ 1145.427627] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1145.427627] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] return self._handle_response(resp) [ 1145.427627] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1145.427627] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] raise exc.from_response(resp, resp.content) [ 1145.427627] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] nova.exception.ImageNotAuthorized: Not authorized for image e017c336-3a02-4b58-874a-44a1d1e154fd. 
[ 1145.427627] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] [ 1145.427627] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] During handling of the above exception, another exception occurred: [ 1145.427627] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] [ 1145.427627] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Traceback (most recent call last): [ 1145.427627] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1145.427953] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] self._build_and_run_instance(context, instance, image, [ 1145.427953] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1145.427953] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] with excutils.save_and_reraise_exception(): [ 1145.427953] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1145.427953] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] self.force_reraise() [ 1145.427953] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1145.427953] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] raise self.value [ 1145.427953] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1145.427953] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] with self.rt.instance_claim(context, instance, node, allocs, [ 1145.427953] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1145.427953] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] self.abort() [ 1145.427953] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1145.427953] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1145.428310] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1145.428310] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] return f(*args, **kwargs) [ 1145.428310] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1145.428310] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] self._unset_instance_host_and_node(instance) [ 1145.428310] env[65918]: ERROR nova.compute.manager 
[instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1145.428310] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] instance.save() [ 1145.428310] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1145.428310] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] updates, result = self.indirection_api.object_action( [ 1145.428310] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1145.428310] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] return cctxt.call(context, 'object_action', objinst=objinst, [ 1145.428310] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1145.428310] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] result = self.transport._send( [ 1145.428723] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1145.428723] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] return self._driver.send(target, ctxt, message, [ 1145.428723] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1145.428723] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1145.428723] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1145.428723] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] raise result [ 1145.428723] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] nova.exception_Remote.InstanceNotFound_Remote: Instance 3b3f8c10-5ba5-445c-a51d-5404874df3d9 could not be found. 
[ 1145.428723] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Traceback (most recent call last): [ 1145.428723] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] [ 1145.428723] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1145.428723] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] return getattr(target, method)(*args, **kwargs) [ 1145.428723] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] [ 1145.428723] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1145.429071] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] return fn(self, *args, **kwargs) [ 1145.429071] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] [ 1145.429071] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1145.429071] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] old_ref, inst_ref = db.instance_update_and_get_original( [ 1145.429071] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] [ 1145.429071] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1145.429071] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] return f(*args, **kwargs) [ 1145.429071] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] [ 1145.429071] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1145.429071] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] with excutils.save_and_reraise_exception() as ectxt: [ 1145.429071] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] [ 1145.429071] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1145.429071] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] self.force_reraise() [ 1145.429071] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] [ 1145.429071] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1145.429503] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] raise self.value [ 1145.429503] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] [ 1145.429503] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1145.429503] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] return f(*args, 
**kwargs) [ 1145.429503] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] [ 1145.429503] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1145.429503] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] return f(context, *args, **kwargs) [ 1145.429503] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] [ 1145.429503] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1145.429503] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1145.429503] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] [ 1145.429503] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1145.429503] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] raise exception.InstanceNotFound(instance_id=uuid) [ 1145.429503] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] [ 1145.429503] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] nova.exception.InstanceNotFound: Instance 3b3f8c10-5ba5-445c-a51d-5404874df3d9 could not be found. [ 1145.429903] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] [ 1145.429903] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] [ 1145.429903] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] During handling of the above exception, another exception occurred: [ 1145.429903] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] [ 1145.429903] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Traceback (most recent call last): [ 1145.429903] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1145.429903] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] ret = obj(*args, **kwargs) [ 1145.429903] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1145.429903] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] exception_handler_v20(status_code, error_body) [ 1145.429903] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1145.429903] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] raise client_exc(message=error_message, [ 1145.429903] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1145.429903] 
env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Neutron server returns request_ids: ['req-0df79b8e-e2af-4d0c-8d6a-816e180ac3b6'] [ 1145.429903] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] [ 1145.430744] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] During handling of the above exception, another exception occurred: [ 1145.430744] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] [ 1145.430744] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Traceback (most recent call last): [ 1145.430744] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1145.430744] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] self._deallocate_network(context, instance, requested_networks) [ 1145.430744] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1145.430744] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] self.network_api.deallocate_for_instance( [ 1145.430744] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1145.430744] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] data = neutron.list_ports(**search_opts) [ 1145.430744] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1145.430744] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] ret = obj(*args, **kwargs) [ 1145.430744] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1145.430744] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] return self.list('ports', self.ports_path, retrieve_all, [ 1145.431273] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1145.431273] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] ret = obj(*args, **kwargs) [ 1145.431273] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1145.431273] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] for r in self._pagination(collection, path, **params): [ 1145.431273] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1145.431273] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] res = self.get(path, params=params) [ 1145.431273] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1145.431273] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] ret = obj(*args, **kwargs) [ 1145.431273] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1145.431273] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] return self.retry_request("GET", action, body=body, [ 1145.431273] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1145.431273] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] ret = obj(*args, **kwargs) [ 1145.431273] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1145.431706] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] return self.do_request(method, action, body=body, [ 1145.431706] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1145.431706] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] ret = obj(*args, **kwargs) [ 1145.431706] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1145.431706] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] self._handle_fault_response(status_code, replybody, resp) [ 1145.431706] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1145.431706] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] raise exception.Unauthorized() [ 1145.431706] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] nova.exception.Unauthorized: Not authorized. [ 1145.431706] env[65918]: ERROR nova.compute.manager [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] [ 1145.457828] env[65918]: DEBUG oslo_concurrency.lockutils [None req-33f6e45f-63c3-44cb-b698-b761c0cc4ba1 tempest-ServersTestManualDisk-1815490856 tempest-ServersTestManualDisk-1815490856-project-member] Lock "3b3f8c10-5ba5-445c-a51d-5404874df3d9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 445.443s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1145.473690] env[65918]: DEBUG nova.compute.manager [None req-09d594fa-82ce-4c0d-b5cc-e32fe0de1bde tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 81fef129-8f9a-4a19-afc0-f27411c36159] Start spawning the instance on the hypervisor. 
{{(pid=65918) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1145.521452] env[65918]: DEBUG oslo_concurrency.lockutils [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1145.522257] env[65918]: ERROR nova.compute.manager [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image e017c336-3a02-4b58-874a-44a1d1e154fd. [ 1145.522257] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] Traceback (most recent call last): [ 1145.522257] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1145.522257] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1145.522257] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1145.522257] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] result = getattr(controller, method)(*args, **kwargs) [ 1145.522257] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1145.522257] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] return self._get(image_id) [ 1145.522257] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1145.522257] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1145.522257] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1145.522865] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] resp, body = self.http_client.get(url, headers=header) [ 1145.522865] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1145.522865] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] return self.request(url, 'GET', **kwargs) [ 1145.522865] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1145.522865] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] return self._handle_response(resp) [ 1145.522865] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File 
"/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1145.522865] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] raise exc.from_response(resp, resp.content) [ 1145.522865] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1145.522865] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] [ 1145.522865] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] During handling of the above exception, another exception occurred: [ 1145.522865] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] [ 1145.522865] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] Traceback (most recent call last): [ 1145.523367] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1145.523367] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] yield resources [ 1145.523367] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1145.523367] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] self.driver.spawn(context, instance, image_meta, [ 1145.523367] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1145.523367] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1145.523367] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1145.523367] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] self._fetch_image_if_missing(context, vi) [ 1145.523367] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1145.523367] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] image_fetch(context, vi, tmp_image_ds_loc) [ 1145.523367] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1145.523367] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] images.fetch_image( [ 1145.523367] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1145.524025] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] metadata = IMAGE_API.get(context, image_ref) [ 1145.524025] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File 
"/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1145.524025] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] return session.show(context, image_id, [ 1145.524025] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1145.524025] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] _reraise_translated_image_exception(image_id) [ 1145.524025] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1145.524025] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] raise new_exc.with_traceback(exc_trace) [ 1145.524025] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1145.524025] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1145.524025] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1145.524025] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] result = getattr(controller, method)(*args, **kwargs) [ 1145.524025] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1145.524025] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] return self._get(image_id) [ 1145.524349] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1145.524349] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1145.524349] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1145.524349] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] resp, body = self.http_client.get(url, headers=header) [ 1145.524349] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1145.524349] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] return self.request(url, 'GET', **kwargs) [ 1145.524349] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1145.524349] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] return self._handle_response(resp) [ 1145.524349] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1145.524349] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] raise 
exc.from_response(resp, resp.content) [ 1145.524349] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] nova.exception.ImageNotAuthorized: Not authorized for image e017c336-3a02-4b58-874a-44a1d1e154fd. [ 1145.524349] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] [ 1145.524636] env[65918]: INFO nova.compute.manager [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] Terminating instance [ 1145.524636] env[65918]: DEBUG oslo_concurrency.lockutils [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1145.524636] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1145.524636] env[65918]: DEBUG nova.compute.manager [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] Start destroying the instance on the hypervisor. {{(pid=65918) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1145.524886] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] Destroying instance {{(pid=65918) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1145.524998] env[65918]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-eefbd782-44cb-48e5-a979-3137a50479ea {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1145.528314] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ff9af21-d1a5-4cf7-98be-c4692b823c80 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1145.537689] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] Unregistering the VM {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1145.537884] env[65918]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-aaf0224d-4d17-4e37-92f1-bb06d7c4bc0e {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1145.541978] env[65918]: DEBUG nova.virt.hardware [None req-09d594fa-82ce-4c0d-b5cc-e32fe0de1bde tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Getting desirable topologies for flavor 
Flavor(created_at=2025-03-07T07:52:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-03-07T07:52:33Z,direct_url=,disk_format='vmdk',id=e017c336-3a02-4b58-874a-44a1d1e154fd,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='a06639ef48fd43768e273db76d6c8f54',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-03-07T07:52:33Z,virtual_size=,visibility=), allow threads: False {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1145.542426] env[65918]: DEBUG nova.virt.hardware [None req-09d594fa-82ce-4c0d-b5cc-e32fe0de1bde tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Flavor limits 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1145.542426] env[65918]: DEBUG nova.virt.hardware [None req-09d594fa-82ce-4c0d-b5cc-e32fe0de1bde tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Image limits 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1145.542829] env[65918]: DEBUG nova.virt.hardware [None req-09d594fa-82ce-4c0d-b5cc-e32fe0de1bde tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Flavor pref 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1145.543040] env[65918]: DEBUG nova.virt.hardware [None req-09d594fa-82ce-4c0d-b5cc-e32fe0de1bde tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Image pref 0:0:0 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1145.543203] env[65918]: DEBUG nova.virt.hardware [None req-09d594fa-82ce-4c0d-b5cc-e32fe0de1bde tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=65918) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1145.543414] env[65918]: DEBUG nova.virt.hardware [None req-09d594fa-82ce-4c0d-b5cc-e32fe0de1bde tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1145.543570] env[65918]: DEBUG nova.virt.hardware [None req-09d594fa-82ce-4c0d-b5cc-e32fe0de1bde tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=65918) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1145.543786] env[65918]: DEBUG nova.virt.hardware [None req-09d594fa-82ce-4c0d-b5cc-e32fe0de1bde tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Got 1 possible topologies {{(pid=65918) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:501}} [ 1145.543884] env[65918]: DEBUG nova.virt.hardware [None req-09d594fa-82ce-4c0d-b5cc-e32fe0de1bde tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1145.544064] env[65918]: DEBUG nova.virt.hardware [None req-09d594fa-82ce-4c0d-b5cc-e32fe0de1bde tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=65918) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1145.545753] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2549f056-29c6-4739-b984-8f2cf40b9bb5 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1145.551532] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1145.551701] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=65918) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1145.552897] env[65918]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5a40630e-332f-4ad1-8994-46e5bebabdb0 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1145.558589] env[65918]: DEBUG oslo_vmware.api [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] Waiting for the task: (returnval){ [ 1145.558589] env[65918]: value = "session[523f5ec2-7667-1e2c-bfa3-3051980fd847]526a165b-95a4-8a34-d157-df17b90118be" [ 1145.558589] env[65918]: _type = "Task" [ 1145.558589] env[65918]: } to complete. 
{{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1145.564827] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73608de2-985a-4ddb-b7f8-f34316259228 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1145.580326] env[65918]: DEBUG nova.policy [None req-09d594fa-82ce-4c0d-b5cc-e32fe0de1bde tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '77fa3b5aa8fe471480344bdb073149aa', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '33301c2fb41942968bbfec91576d4822', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=65918) authorize /opt/stack/nova/nova/policy.py:203}} [ 1145.586478] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] Preparing fetch location {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1145.586478] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] Creating directory with path [datastore1] vmware_temp/2b5fb2d7-bbcd-4470-a282-63abfca1bf58/e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1145.586708] env[65918]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a93df551-d913-4a88-8ada-44cebd4052ac {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1145.605073] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] Created directory with path [datastore1] vmware_temp/2b5fb2d7-bbcd-4470-a282-63abfca1bf58/e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1145.605306] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] Fetch image to [datastore1] vmware_temp/2b5fb2d7-bbcd-4470-a282-63abfca1bf58/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1145.605488] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] Downloading image file data e017c336-3a02-4b58-874a-44a1d1e154fd to [datastore1] vmware_temp/2b5fb2d7-bbcd-4470-a282-63abfca1bf58/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk on the data store datastore1 {{(pid=65918) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1145.606221] 
env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9960a4f-fe12-4660-9ef8-4bb1a954e72b {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1145.613050] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb365bba-6547-4301-afe5-14707fa618c9 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1145.623148] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-199aa12e-0f6e-49e4-8365-312b2415088a {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1145.626883] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] Unregistered the VM {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1145.627127] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] Deleting contents of the VM from datastore datastore1 {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1145.627314] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] Deleting the datastore file [datastore1] 0ccebca0-a1a4-48b2-9154-1c73350dab38 {{(pid=65918) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1145.627661] env[65918]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-2582a1aa-6b22-4c3b-8dc5-77da018d64c7 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1145.658172] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-785c2a73-3b7b-4da8-adfd-701df907bec8 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1145.660742] env[65918]: DEBUG oslo_vmware.api [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] Waiting for the task: (returnval){ [ 1145.660742] env[65918]: value = "task-2848237" [ 1145.660742] env[65918]: _type = "Task" [ 1145.660742] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1145.665932] env[65918]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-373c8026-fbe5-4e65-9bab-b947deff6101 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1145.670108] env[65918]: DEBUG oslo_vmware.api [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] Task: {'id': task-2848237, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1145.699275] env[65918]: DEBUG nova.virt.vmwareapi.images [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] Downloading image file data e017c336-3a02-4b58-874a-44a1d1e154fd to the data store datastore1 {{(pid=65918) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1145.802390] env[65918]: DEBUG oslo_concurrency.lockutils [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1145.803301] env[65918]: ERROR nova.compute.manager [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image e017c336-3a02-4b58-874a-44a1d1e154fd. [ 1145.803301] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] Traceback (most recent call last): [ 1145.803301] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1145.803301] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1145.803301] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1145.803301] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] result = getattr(controller, method)(*args, **kwargs) [ 1145.803301] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1145.803301] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] return self._get(image_id) [ 1145.803301] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1145.803301] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1145.803301] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1145.803768] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] resp, body = self.http_client.get(url, headers=header) [ 1145.803768] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1145.803768] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] return self.request(url, 'GET', **kwargs) [ 1145.803768] env[65918]: ERROR nova.compute.manager [instance: 
a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1145.803768] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] return self._handle_response(resp) [ 1145.803768] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1145.803768] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] raise exc.from_response(resp, resp.content) [ 1145.803768] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1145.803768] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] [ 1145.803768] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] During handling of the above exception, another exception occurred: [ 1145.803768] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] [ 1145.803768] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] Traceback (most recent call last): [ 1145.804277] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1145.804277] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] yield resources [ 1145.804277] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1145.804277] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] self.driver.spawn(context, instance, image_meta, [ 1145.804277] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1145.804277] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1145.804277] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1145.804277] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] self._fetch_image_if_missing(context, vi) [ 1145.804277] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1145.804277] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] image_fetch(context, vi, tmp_image_ds_loc) [ 1145.804277] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1145.804277] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] images.fetch_image( [ 1145.804277] env[65918]: ERROR nova.compute.manager 
[instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1145.805122] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] metadata = IMAGE_API.get(context, image_ref) [ 1145.805122] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1145.805122] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] return session.show(context, image_id, [ 1145.805122] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1145.805122] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] _reraise_translated_image_exception(image_id) [ 1145.805122] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1145.805122] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] raise new_exc.with_traceback(exc_trace) [ 1145.805122] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1145.805122] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1145.805122] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1145.805122] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] result = getattr(controller, method)(*args, **kwargs) [ 1145.805122] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1145.805122] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] return self._get(image_id) [ 1145.805689] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1145.805689] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1145.805689] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1145.805689] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] resp, body = self.http_client.get(url, headers=header) [ 1145.805689] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1145.805689] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] return self.request(url, 'GET', **kwargs) [ 1145.805689] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1145.805689] env[65918]: ERROR nova.compute.manager [instance: 
a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] return self._handle_response(resp) [ 1145.805689] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1145.805689] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] raise exc.from_response(resp, resp.content) [ 1145.805689] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] nova.exception.ImageNotAuthorized: Not authorized for image e017c336-3a02-4b58-874a-44a1d1e154fd. [ 1145.805689] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] [ 1145.806152] env[65918]: INFO nova.compute.manager [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] Terminating instance [ 1145.806152] env[65918]: DEBUG oslo_concurrency.lockutils [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1145.806152] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1145.806152] env[65918]: DEBUG nova.compute.manager [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] Start destroying the instance on the hypervisor. 
{{(pid=65918) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1145.806152] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] Destroying instance {{(pid=65918) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1145.806344] env[65918]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5ab14b8c-9972-4576-9bd0-bf35d1573019 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1145.810936] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1bd03df4-d49e-4083-92dc-1b90ccb9989f {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1145.817951] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] Unregistering the VM {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1145.818277] env[65918]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-3f7853dc-9079-48ef-8c1b-31a68edba6d8 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1145.820727] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1145.820894] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=65918) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1145.821849] env[65918]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-395f9c16-bb65-4263-a2b5-16897e5c3665 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1145.826447] env[65918]: DEBUG oslo_vmware.api [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] Waiting for the task: (returnval){ [ 1145.826447] env[65918]: value = "session[523f5ec2-7667-1e2c-bfa3-3051980fd847]52b7b67e-9972-2e2b-572e-3ca921062a18" [ 1145.826447] env[65918]: _type = "Task" [ 1145.826447] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1145.833938] env[65918]: DEBUG oslo_vmware.api [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] Task: {'id': session[523f5ec2-7667-1e2c-bfa3-3051980fd847]52b7b67e-9972-2e2b-572e-3ca921062a18, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1145.877174] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] Unregistered the VM {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1145.877432] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] Deleting contents of the VM from datastore datastore1 {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1145.877607] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] Deleting the datastore file [datastore1] a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398 {{(pid=65918) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1145.877876] env[65918]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-2fec9ee2-8e52-4a2a-b471-4a9443aa6db1 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1145.884168] env[65918]: DEBUG oslo_vmware.api [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] Waiting for the task: (returnval){ [ 1145.884168] env[65918]: value = "task-2848239" [ 1145.884168] env[65918]: _type = "Task" [ 1145.884168] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1145.891464] env[65918]: DEBUG oslo_vmware.api [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] Task: {'id': task-2848239, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1146.100514] env[65918]: DEBUG nova.network.neutron [None req-09d594fa-82ce-4c0d-b5cc-e32fe0de1bde tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 81fef129-8f9a-4a19-afc0-f27411c36159] Successfully created port: c1dc3d30-74bf-4ad2-987e-f5a8e1b75667 {{(pid=65918) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1146.170745] env[65918]: DEBUG oslo_vmware.api [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] Task: {'id': task-2848237, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.067867} completed successfully. 
{{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1146.171026] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] Deleted the datastore file {{(pid=65918) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1146.172203] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] Deleted contents of the VM from datastore datastore1 {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1146.172203] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] Instance destroyed {{(pid=65918) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1146.172203] env[65918]: INFO nova.compute.manager [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] Took 0.65 seconds to destroy the instance on the hypervisor. [ 1146.174172] env[65918]: DEBUG nova.compute.claims [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] Aborting claim: {{(pid=65918) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1146.174477] env[65918]: DEBUG oslo_concurrency.lockutils [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1146.174580] env[65918]: DEBUG oslo_concurrency.lockutils [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1146.204864] env[65918]: DEBUG oslo_concurrency.lockutils [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.027s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1146.204864] env[65918]: DEBUG nova.compute.utils [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] Instance 0ccebca0-a1a4-48b2-9154-1c73350dab38 could not be found. 
{{(pid=65918) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1146.207591] env[65918]: DEBUG nova.compute.manager [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] Instance disappeared during build. {{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1146.207679] env[65918]: DEBUG nova.compute.manager [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] Unplugging VIFs for instance {{(pid=65918) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1146.207833] env[65918]: DEBUG nova.compute.manager [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=65918) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1146.208007] env[65918]: DEBUG nova.compute.manager [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] Deallocating network for instance {{(pid=65918) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1146.208710] env[65918]: DEBUG nova.network.neutron [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] deallocate_for_instance() {{(pid=65918) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1146.335983] env[65918]: DEBUG neutronclient.v2_0.client [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=65918) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1146.337847] env[65918]: ERROR nova.compute.manager [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 1146.337847] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] Traceback (most recent call last): [ 1146.337847] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1146.337847] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1146.337847] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1146.337847] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] result = getattr(controller, method)(*args, **kwargs) [ 1146.337847] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1146.337847] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] return self._get(image_id) [ 1146.337847] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1146.337847] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1146.337847] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1146.337847] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] resp, body = self.http_client.get(url, headers=header) [ 1146.338299] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1146.338299] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] return self.request(url, 'GET', **kwargs) [ 1146.338299] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1146.338299] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] return self._handle_response(resp) [ 1146.338299] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1146.338299] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] raise exc.from_response(resp, resp.content) [ 1146.338299] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1146.338299] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] [ 1146.338299] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] During handling of the above exception, another exception occurred: [ 1146.338299] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] [ 1146.338299] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] Traceback (most recent call last): [ 1146.338299] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1146.338673] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] self.driver.spawn(context, instance, image_meta, [ 1146.338673] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1146.338673] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1146.338673] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1146.338673] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] self._fetch_image_if_missing(context, vi) [ 1146.338673] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1146.338673] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] image_fetch(context, vi, tmp_image_ds_loc) [ 1146.338673] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1146.338673] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] images.fetch_image( [ 1146.338673] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1146.338673] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] metadata = IMAGE_API.get(context, image_ref) [ 1146.338673] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1146.338673] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] return session.show(context, image_id, [ 1146.339075] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1146.339075] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] _reraise_translated_image_exception(image_id) [ 1146.339075] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1146.339075] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] raise new_exc.with_traceback(exc_trace) [ 1146.339075] env[65918]: ERROR nova.compute.manager [instance: 
0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1146.339075] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1146.339075] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1146.339075] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] result = getattr(controller, method)(*args, **kwargs) [ 1146.339075] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1146.339075] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] return self._get(image_id) [ 1146.339075] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1146.339075] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1146.339075] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1146.339491] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] resp, body = self.http_client.get(url, headers=header) [ 1146.339491] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1146.339491] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] return self.request(url, 'GET', **kwargs) [ 1146.339491] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1146.339491] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] return self._handle_response(resp) [ 1146.339491] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1146.339491] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] raise exc.from_response(resp, resp.content) [ 1146.339491] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] nova.exception.ImageNotAuthorized: Not authorized for image e017c336-3a02-4b58-874a-44a1d1e154fd. 
[ 1146.339491] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] [ 1146.339491] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] During handling of the above exception, another exception occurred: [ 1146.339491] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] [ 1146.339491] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] Traceback (most recent call last): [ 1146.339491] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1146.339861] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] self._build_and_run_instance(context, instance, image, [ 1146.339861] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1146.339861] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] with excutils.save_and_reraise_exception(): [ 1146.339861] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1146.339861] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] self.force_reraise() [ 1146.339861] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1146.339861] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] raise self.value [ 1146.339861] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1146.339861] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] with self.rt.instance_claim(context, instance, node, allocs, [ 1146.339861] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1146.339861] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] self.abort() [ 1146.339861] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1146.339861] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1146.340278] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1146.340278] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] return f(*args, **kwargs) [ 1146.340278] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1146.340278] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] self._unset_instance_host_and_node(instance) [ 1146.340278] env[65918]: ERROR nova.compute.manager 
[instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1146.340278] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] instance.save() [ 1146.340278] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1146.340278] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] updates, result = self.indirection_api.object_action( [ 1146.340278] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1146.340278] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] return cctxt.call(context, 'object_action', objinst=objinst, [ 1146.340278] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1146.340278] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] result = self.transport._send( [ 1146.340672] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1146.340672] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] return self._driver.send(target, ctxt, message, [ 1146.340672] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1146.340672] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1146.340672] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1146.340672] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] raise result [ 1146.340672] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] nova.exception_Remote.InstanceNotFound_Remote: Instance 0ccebca0-a1a4-48b2-9154-1c73350dab38 could not be found. 
[ 1146.340672] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] Traceback (most recent call last): [ 1146.340672] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] [ 1146.340672] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1146.340672] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] return getattr(target, method)(*args, **kwargs) [ 1146.340672] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] [ 1146.340672] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1146.342851] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] return fn(self, *args, **kwargs) [ 1146.342851] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] [ 1146.342851] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1146.342851] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] old_ref, inst_ref = db.instance_update_and_get_original( [ 1146.342851] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] [ 1146.342851] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1146.342851] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] return f(*args, **kwargs) [ 1146.342851] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] [ 1146.342851] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1146.342851] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] with excutils.save_and_reraise_exception() as ectxt: [ 1146.342851] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] [ 1146.342851] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1146.342851] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] self.force_reraise() [ 1146.342851] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] [ 1146.342851] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1146.343346] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] raise self.value [ 1146.343346] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] [ 1146.343346] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1146.343346] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] return f(*args, 
**kwargs) [ 1146.343346] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] [ 1146.343346] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1146.343346] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] return f(context, *args, **kwargs) [ 1146.343346] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] [ 1146.343346] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1146.343346] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1146.343346] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] [ 1146.343346] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1146.343346] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] raise exception.InstanceNotFound(instance_id=uuid) [ 1146.343346] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] [ 1146.343346] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] nova.exception.InstanceNotFound: Instance 0ccebca0-a1a4-48b2-9154-1c73350dab38 could not be found. [ 1146.343774] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] [ 1146.343774] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] [ 1146.343774] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] During handling of the above exception, another exception occurred: [ 1146.343774] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] [ 1146.343774] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] Traceback (most recent call last): [ 1146.343774] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1146.343774] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] ret = obj(*args, **kwargs) [ 1146.343774] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1146.343774] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] exception_handler_v20(status_code, error_body) [ 1146.343774] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1146.343774] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] raise client_exc(message=error_message, [ 1146.343774] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1146.343774] 
env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] Neutron server returns request_ids: ['req-478a4f05-3303-4c11-b96d-40b1f1f6900f'] [ 1146.343774] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] [ 1146.344175] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] During handling of the above exception, another exception occurred: [ 1146.344175] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] [ 1146.344175] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] Traceback (most recent call last): [ 1146.344175] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1146.344175] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] self._deallocate_network(context, instance, requested_networks) [ 1146.344175] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1146.344175] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] self.network_api.deallocate_for_instance( [ 1146.344175] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1146.344175] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] data = neutron.list_ports(**search_opts) [ 1146.344175] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1146.344175] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] ret = obj(*args, **kwargs) [ 1146.344175] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1146.344175] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] return self.list('ports', self.ports_path, retrieve_all, [ 1146.345093] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1146.345093] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] ret = obj(*args, **kwargs) [ 1146.345093] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1146.345093] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] for r in self._pagination(collection, path, **params): [ 1146.345093] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1146.345093] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] res = self.get(path, params=params) [ 1146.345093] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1146.345093] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] ret = obj(*args, **kwargs) [ 1146.345093] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1146.345093] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] return self.retry_request("GET", action, body=body, [ 1146.345093] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1146.345093] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] ret = obj(*args, **kwargs) [ 1146.345093] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1146.345471] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] return self.do_request(method, action, body=body, [ 1146.345471] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1146.345471] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] ret = obj(*args, **kwargs) [ 1146.345471] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1146.345471] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] self._handle_fault_response(status_code, replybody, resp) [ 1146.345471] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1146.345471] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] raise exception.Unauthorized() [ 1146.345471] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] nova.exception.Unauthorized: Not authorized. 
[ 1146.345471] env[65918]: ERROR nova.compute.manager [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] [ 1146.353336] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] Preparing fetch location {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1146.353570] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] Creating directory with path [datastore1] vmware_temp/cb6db229-37bd-4552-8e80-e37c38af1216/e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1146.353809] env[65918]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c3bfeead-63a1-42b8-9814-8dce493dbd21 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1146.368259] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] Created directory with path [datastore1] vmware_temp/cb6db229-37bd-4552-8e80-e37c38af1216/e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1146.368468] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] Fetch image to [datastore1] vmware_temp/cb6db229-37bd-4552-8e80-e37c38af1216/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1146.368635] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] Downloading image file data e017c336-3a02-4b58-874a-44a1d1e154fd to [datastore1] vmware_temp/cb6db229-37bd-4552-8e80-e37c38af1216/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk on the data store datastore1 {{(pid=65918) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1146.369408] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-25e166d6-e498-494b-a9d3-31e55332c591 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1146.377502] env[65918]: DEBUG oslo_concurrency.lockutils [None req-b74f537b-eb34-4756-80b2-96cb1a8e586c tempest-ServerDiskConfigTestJSON-1421108446 tempest-ServerDiskConfigTestJSON-1421108446-project-member] Lock "0ccebca0-a1a4-48b2-9154-1c73350dab38" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 444.200s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1146.378226] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-16593d08-68d0-4a33-bd6b-00df9adfaa05 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1146.391020] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83bf9498-e22c-4bfe-b0be-71b0e8734587 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1146.400201] env[65918]: DEBUG oslo_vmware.api [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] Task: {'id': task-2848239, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.063608} completed successfully. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1146.426423] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] Deleted the datastore file {{(pid=65918) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1146.426690] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] Deleted contents of the VM from datastore datastore1 {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1146.426921] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] Instance destroyed {{(pid=65918) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1146.427149] env[65918]: INFO nova.compute.manager [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] Took 0.62 seconds to destroy the instance on the hypervisor. 
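Editor's note: the records above show the driver-side teardown path for instance a4d6dc5b — a DeleteDatastoreFile_Task is polled to completion, the datastore file and VM contents are removed, and the instance is reported destroyed about 0.6 seconds later. The sketch below illustrates only the generic oslo.vmware invoke-and-poll pattern those lines come from (invoke_api on a vCenter "*_Task" method, then wait_for_task, which is what emits the _poll_task debug lines). It is not Nova's code; the vCenter host, credentials, retry/poll settings, datastore path and datacenter reference are made-up placeholders, and constructing the session opens a real connection to a vCenter.

# Minimal sketch of the oslo.vmware call/poll pattern seen in the log above.
from oslo_vmware import api as vmware_api

# Placeholder endpoint and credentials (assumptions, not values from this log).
session = vmware_api.VMwareAPISession(
    'vcenter.example.org',
    'administrator@vsphere.local',
    'secret',
    api_retry_count=10,
    task_poll_interval=0.5)

file_manager = session.vim.service_content.fileManager
datacenter_ref = None  # a real caller passes the Datacenter managed object reference

# Start the asynchronous datastore delete, then block until vCenter reports the
# task done; wait_for_task polls the task object, producing lines like
# "Task: {'id': task-..., 'name': DeleteDatastoreFile_Task ...} completed successfully."
task = session.invoke_api(
    session.vim, 'DeleteDatastoreFile_Task', file_manager,
    name='[datastore1] example_dir/example_disk.vmdk',  # placeholder path
    datacenter=datacenter_ref)
session.wait_for_task(task)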
[ 1146.429310] env[65918]: DEBUG nova.compute.claims [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] Aborting claim: {{(pid=65918) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1146.429544] env[65918]: DEBUG oslo_concurrency.lockutils [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1146.429808] env[65918]: DEBUG oslo_concurrency.lockutils [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1146.432867] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b65c9410-20cd-4e1c-8aad-6d29765d2b8a {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1146.442779] env[65918]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-fa79b35e-d7bf-4d8b-b2dd-15157fa785e2 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1146.462948] env[65918]: DEBUG oslo_concurrency.lockutils [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.033s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1146.464134] env[65918]: DEBUG nova.compute.utils [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] Instance a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398 could not be found. {{(pid=65918) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1146.467153] env[65918]: DEBUG nova.virt.vmwareapi.images [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] Downloading image file data e017c336-3a02-4b58-874a-44a1d1e154fd to the data store datastore1 {{(pid=65918) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1146.469229] env[65918]: DEBUG nova.compute.manager [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] Instance disappeared during build. 
{{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1146.469446] env[65918]: DEBUG nova.compute.manager [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] Unplugging VIFs for instance {{(pid=65918) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1146.469646] env[65918]: DEBUG nova.compute.manager [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=65918) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1146.469868] env[65918]: DEBUG nova.compute.manager [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] Deallocating network for instance {{(pid=65918) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1146.470073] env[65918]: DEBUG nova.network.neutron [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] deallocate_for_instance() {{(pid=65918) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1146.507551] env[65918]: DEBUG oslo_concurrency.lockutils [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1146.507551] env[65918]: ERROR nova.compute.manager [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image e017c336-3a02-4b58-874a-44a1d1e154fd. 
[ 1146.507551] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] Traceback (most recent call last): [ 1146.507551] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1146.507551] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1146.507551] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1146.507551] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] result = getattr(controller, method)(*args, **kwargs) [ 1146.507551] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1146.507551] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] return self._get(image_id) [ 1146.507942] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1146.507942] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1146.507942] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1146.507942] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] resp, body = self.http_client.get(url, headers=header) [ 1146.507942] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1146.507942] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] return self.request(url, 'GET', **kwargs) [ 1146.507942] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1146.507942] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] return self._handle_response(resp) [ 1146.507942] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1146.507942] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] raise exc.from_response(resp, resp.content) [ 1146.507942] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1146.508472] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] [ 1146.508472] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] During handling of the above exception, another exception occurred: [ 1146.508472] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] [ 1146.508472] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] Traceback (most recent call last): [ 1146.508472] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1146.508472] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] yield resources [ 1146.508472] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1146.508472] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] self.driver.spawn(context, instance, image_meta, [ 1146.508472] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1146.508472] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1146.508472] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1146.508472] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] self._fetch_image_if_missing(context, vi) [ 1146.508472] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1146.508472] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] image_fetch(context, vi, tmp_image_ds_loc) [ 1146.509584] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1146.509584] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] images.fetch_image( [ 1146.509584] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1146.509584] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] metadata = IMAGE_API.get(context, image_ref) [ 1146.509584] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1146.509584] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] return session.show(context, image_id, [ 1146.509584] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1146.509584] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] _reraise_translated_image_exception(image_id) [ 1146.509584] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File 
"/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1146.509584] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] raise new_exc.with_traceback(exc_trace) [ 1146.509584] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1146.509584] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1146.509584] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1146.510073] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] result = getattr(controller, method)(*args, **kwargs) [ 1146.510073] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1146.510073] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] return self._get(image_id) [ 1146.510073] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1146.510073] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1146.510073] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1146.510073] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] resp, body = self.http_client.get(url, headers=header) [ 1146.510073] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1146.510073] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] return self.request(url, 'GET', **kwargs) [ 1146.510073] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1146.510073] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] return self._handle_response(resp) [ 1146.510073] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1146.510508] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] raise exc.from_response(resp, resp.content) [ 1146.510508] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] nova.exception.ImageNotAuthorized: Not authorized for image e017c336-3a02-4b58-874a-44a1d1e154fd. 
[ 1146.510508] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] [ 1146.510508] env[65918]: INFO nova.compute.manager [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] Terminating instance [ 1146.510508] env[65918]: DEBUG oslo_concurrency.lockutils [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1146.510508] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1146.510754] env[65918]: DEBUG nova.compute.manager [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] Start destroying the instance on the hypervisor. {{(pid=65918) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1146.510894] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] Destroying instance {{(pid=65918) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1146.511359] env[65918]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-28510df9-ee8d-450d-a4fc-8b56810f1fd1 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1146.516790] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d13dea5a-29e4-4854-9a80-3a7cd3c866c1 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1146.523021] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] Unregistering the VM {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1146.523021] env[65918]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-a7293e3e-25ae-4221-9df1-151f419c55fa {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1146.524008] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1146.524183] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 
tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=65918) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1146.525107] env[65918]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4745e6fb-5efb-45c3-95b8-0111e80c90ec {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1146.530126] env[65918]: DEBUG oslo_vmware.api [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Waiting for the task: (returnval){ [ 1146.530126] env[65918]: value = "session[523f5ec2-7667-1e2c-bfa3-3051980fd847]52d0b07b-49cc-0b5a-dccc-13d7da31db43" [ 1146.530126] env[65918]: _type = "Task" [ 1146.530126] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1146.538682] env[65918]: DEBUG oslo_vmware.api [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Task: {'id': session[523f5ec2-7667-1e2c-bfa3-3051980fd847]52d0b07b-49cc-0b5a-dccc-13d7da31db43, 'name': SearchDatastore_Task} progress is 0%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1146.591133] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] Unregistered the VM {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1146.591133] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] Deleting contents of the VM from datastore datastore1 {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1146.591133] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] Deleting the datastore file [datastore1] c04e5253-0275-4fb3-8eca-6a395c95930f {{(pid=65918) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1146.591133] env[65918]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-97a6b5f5-255d-43e2-83d1-c6e40864b976 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1146.596222] env[65918]: DEBUG oslo_vmware.api [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] Waiting for the task: (returnval){ [ 1146.596222] env[65918]: value = "task-2848241" [ 1146.596222] env[65918]: _type = "Task" [ 1146.596222] env[65918]: } to complete. 
{{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1146.597833] env[65918]: DEBUG neutronclient.v2_0.client [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=65918) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1146.598776] env[65918]: ERROR nova.compute.manager [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 1146.598776] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] Traceback (most recent call last): [ 1146.598776] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1146.598776] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1146.598776] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1146.598776] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] result = getattr(controller, method)(*args, **kwargs) [ 1146.598776] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1146.598776] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] return self._get(image_id) [ 1146.598776] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1146.598776] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1146.598776] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1146.598776] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] resp, body = self.http_client.get(url, headers=header) [ 1146.599221] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1146.599221] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] return self.request(url, 'GET', **kwargs) [ 1146.599221] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1146.599221] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] return self._handle_response(resp) [ 1146.599221] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in 
_handle_response [ 1146.599221] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] raise exc.from_response(resp, resp.content) [ 1146.599221] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1146.599221] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] [ 1146.599221] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] During handling of the above exception, another exception occurred: [ 1146.599221] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] [ 1146.599221] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] Traceback (most recent call last): [ 1146.599221] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1146.599568] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] self.driver.spawn(context, instance, image_meta, [ 1146.599568] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1146.599568] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1146.599568] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1146.599568] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] self._fetch_image_if_missing(context, vi) [ 1146.599568] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1146.599568] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] image_fetch(context, vi, tmp_image_ds_loc) [ 1146.599568] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1146.599568] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] images.fetch_image( [ 1146.599568] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1146.599568] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] metadata = IMAGE_API.get(context, image_ref) [ 1146.599568] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1146.599568] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] return session.show(context, image_id, [ 1146.599948] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1146.599948] 
env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] _reraise_translated_image_exception(image_id) [ 1146.599948] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1146.599948] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] raise new_exc.with_traceback(exc_trace) [ 1146.599948] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1146.599948] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1146.599948] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1146.599948] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] result = getattr(controller, method)(*args, **kwargs) [ 1146.599948] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1146.599948] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] return self._get(image_id) [ 1146.599948] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1146.599948] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1146.599948] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1146.600300] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] resp, body = self.http_client.get(url, headers=header) [ 1146.600300] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1146.600300] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] return self.request(url, 'GET', **kwargs) [ 1146.600300] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1146.600300] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] return self._handle_response(resp) [ 1146.600300] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1146.600300] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] raise exc.from_response(resp, resp.content) [ 1146.600300] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] nova.exception.ImageNotAuthorized: Not authorized for image e017c336-3a02-4b58-874a-44a1d1e154fd. 
[ 1146.600300] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] [ 1146.600300] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] During handling of the above exception, another exception occurred: [ 1146.600300] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] [ 1146.600300] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] Traceback (most recent call last): [ 1146.600300] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1146.600679] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] self._build_and_run_instance(context, instance, image, [ 1146.600679] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1146.600679] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] with excutils.save_and_reraise_exception(): [ 1146.600679] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1146.600679] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] self.force_reraise() [ 1146.600679] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1146.600679] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] raise self.value [ 1146.600679] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1146.600679] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] with self.rt.instance_claim(context, instance, node, allocs, [ 1146.600679] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1146.600679] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] self.abort() [ 1146.600679] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1146.600679] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1146.601082] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1146.601082] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] return f(*args, **kwargs) [ 1146.601082] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1146.601082] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] self._unset_instance_host_and_node(instance) [ 1146.601082] env[65918]: ERROR nova.compute.manager 
[instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1146.601082] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] instance.save() [ 1146.601082] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1146.601082] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] updates, result = self.indirection_api.object_action( [ 1146.601082] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1146.601082] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] return cctxt.call(context, 'object_action', objinst=objinst, [ 1146.601082] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1146.601082] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] result = self.transport._send( [ 1146.601416] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1146.601416] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] return self._driver.send(target, ctxt, message, [ 1146.601416] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1146.601416] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1146.601416] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1146.601416] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] raise result [ 1146.601416] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] nova.exception_Remote.InstanceNotFound_Remote: Instance a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398 could not be found. 
[ 1146.601416] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] Traceback (most recent call last): [ 1146.601416] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] [ 1146.601416] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1146.601416] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] return getattr(target, method)(*args, **kwargs) [ 1146.601416] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] [ 1146.601416] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1146.601784] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] return fn(self, *args, **kwargs) [ 1146.601784] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] [ 1146.601784] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1146.601784] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] old_ref, inst_ref = db.instance_update_and_get_original( [ 1146.601784] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] [ 1146.601784] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1146.601784] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] return f(*args, **kwargs) [ 1146.601784] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] [ 1146.601784] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1146.601784] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] with excutils.save_and_reraise_exception() as ectxt: [ 1146.601784] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] [ 1146.601784] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1146.601784] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] self.force_reraise() [ 1146.601784] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] [ 1146.601784] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1146.603496] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] raise self.value [ 1146.603496] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] [ 1146.603496] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1146.603496] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] return f(*args, 
**kwargs) [ 1146.603496] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] [ 1146.603496] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1146.603496] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] return f(context, *args, **kwargs) [ 1146.603496] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] [ 1146.603496] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1146.603496] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1146.603496] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] [ 1146.603496] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1146.603496] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] raise exception.InstanceNotFound(instance_id=uuid) [ 1146.603496] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] [ 1146.603496] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] nova.exception.InstanceNotFound: Instance a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398 could not be found. [ 1146.603942] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] [ 1146.603942] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] [ 1146.603942] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] During handling of the above exception, another exception occurred: [ 1146.603942] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] [ 1146.603942] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] Traceback (most recent call last): [ 1146.603942] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1146.603942] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] ret = obj(*args, **kwargs) [ 1146.603942] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1146.603942] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] exception_handler_v20(status_code, error_body) [ 1146.603942] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1146.603942] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] raise client_exc(message=error_message, [ 1146.603942] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1146.603942] 
env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] Neutron server returns request_ids: ['req-885bef99-a960-43b9-a730-5ad28ba5a347'] [ 1146.603942] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] [ 1146.604300] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] During handling of the above exception, another exception occurred: [ 1146.604300] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] [ 1146.604300] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] Traceback (most recent call last): [ 1146.604300] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1146.604300] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] self._deallocate_network(context, instance, requested_networks) [ 1146.604300] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1146.604300] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] self.network_api.deallocate_for_instance( [ 1146.604300] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1146.604300] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] data = neutron.list_ports(**search_opts) [ 1146.604300] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1146.604300] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] ret = obj(*args, **kwargs) [ 1146.604300] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1146.604300] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] return self.list('ports', self.ports_path, retrieve_all, [ 1146.604629] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1146.604629] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] ret = obj(*args, **kwargs) [ 1146.604629] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1146.604629] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] for r in self._pagination(collection, path, **params): [ 1146.604629] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1146.604629] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] res = self.get(path, params=params) [ 1146.604629] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1146.604629] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] ret = obj(*args, **kwargs) [ 1146.604629] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1146.604629] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] return self.retry_request("GET", action, body=body, [ 1146.604629] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1146.604629] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] ret = obj(*args, **kwargs) [ 1146.604629] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1146.605054] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] return self.do_request(method, action, body=body, [ 1146.605054] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1146.605054] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] ret = obj(*args, **kwargs) [ 1146.605054] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1146.605054] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] self._handle_fault_response(status_code, replybody, resp) [ 1146.605054] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1146.605054] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] raise exception.Unauthorized() [ 1146.605054] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] nova.exception.Unauthorized: Not authorized. [ 1146.605054] env[65918]: ERROR nova.compute.manager [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] [ 1146.615386] env[65918]: DEBUG oslo_vmware.api [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] Task: {'id': task-2848241, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1146.635227] env[65918]: DEBUG oslo_concurrency.lockutils [None req-e4f09c41-237c-4300-9327-559fa5e8c45d tempest-ServerGroupTestJSON-1521414906 tempest-ServerGroupTestJSON-1521414906-project-member] Lock "a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 442.255s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1147.041992] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Preparing fetch location {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1147.041992] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Creating directory with path [datastore1] vmware_temp/1638514a-40a9-46d3-afe1-83d04937714e/e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1147.042452] env[65918]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-348d3a57-78f9-4fe1-92c3-25c75df133e9 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1147.054831] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Created directory with path [datastore1] vmware_temp/1638514a-40a9-46d3-afe1-83d04937714e/e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1147.056025] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Fetch image to [datastore1] vmware_temp/1638514a-40a9-46d3-afe1-83d04937714e/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1147.056025] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Downloading image file data e017c336-3a02-4b58-874a-44a1d1e154fd to [datastore1] vmware_temp/1638514a-40a9-46d3-afe1-83d04937714e/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk on the data store datastore1 {{(pid=65918) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1147.056025] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-80c3421d-9492-4b08-8e24-1a55d9b8451f {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1147.063069] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b1102cd-a517-4c3c-bce0-de9fd8ab0e16 {{(pid=65918) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1147.073176] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9053acf6-cf19-489a-a354-f3ec8f24c29c {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1147.077882] env[65918]: DEBUG nova.compute.manager [req-de65ff63-c5b3-4afe-a9d4-8604d977f55b req-be147cf0-82a5-4249-9ddf-9a6673b1dbe1 service nova] [instance: 81fef129-8f9a-4a19-afc0-f27411c36159] Received event network-vif-plugged-c1dc3d30-74bf-4ad2-987e-f5a8e1b75667 {{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1147.078086] env[65918]: DEBUG oslo_concurrency.lockutils [req-de65ff63-c5b3-4afe-a9d4-8604d977f55b req-be147cf0-82a5-4249-9ddf-9a6673b1dbe1 service nova] Acquiring lock "81fef129-8f9a-4a19-afc0-f27411c36159-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1147.078349] env[65918]: DEBUG oslo_concurrency.lockutils [req-de65ff63-c5b3-4afe-a9d4-8604d977f55b req-be147cf0-82a5-4249-9ddf-9a6673b1dbe1 service nova] Lock "81fef129-8f9a-4a19-afc0-f27411c36159-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1147.078503] env[65918]: DEBUG oslo_concurrency.lockutils [req-de65ff63-c5b3-4afe-a9d4-8604d977f55b req-be147cf0-82a5-4249-9ddf-9a6673b1dbe1 service nova] Lock "81fef129-8f9a-4a19-afc0-f27411c36159-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1147.078661] env[65918]: DEBUG nova.compute.manager [req-de65ff63-c5b3-4afe-a9d4-8604d977f55b req-be147cf0-82a5-4249-9ddf-9a6673b1dbe1 service nova] [instance: 81fef129-8f9a-4a19-afc0-f27411c36159] No waiting events found dispatching network-vif-plugged-c1dc3d30-74bf-4ad2-987e-f5a8e1b75667 {{(pid=65918) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1147.078814] env[65918]: WARNING nova.compute.manager [req-de65ff63-c5b3-4afe-a9d4-8604d977f55b req-be147cf0-82a5-4249-9ddf-9a6673b1dbe1 service nova] [instance: 81fef129-8f9a-4a19-afc0-f27411c36159] Received unexpected event network-vif-plugged-c1dc3d30-74bf-4ad2-987e-f5a8e1b75667 for instance with vm_state building and task_state spawning. [ 1147.111149] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b985e7ae-516c-40db-a2a1-a8e7c5a620ad {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1147.121468] env[65918]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-6740807c-e3fa-43c8-ba09-1382389058d8 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1147.123263] env[65918]: DEBUG oslo_vmware.api [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] Task: {'id': task-2848241, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.066491} completed successfully. 
{{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1147.123491] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] Deleted the datastore file {{(pid=65918) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1147.123768] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] Deleted contents of the VM from datastore datastore1 {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1147.123945] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] Instance destroyed {{(pid=65918) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1147.124125] env[65918]: INFO nova.compute.manager [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] Took 0.61 seconds to destroy the instance on the hypervisor. [ 1147.126298] env[65918]: DEBUG nova.compute.claims [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] Aborting claim: {{(pid=65918) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1147.126481] env[65918]: DEBUG oslo_concurrency.lockutils [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1147.126689] env[65918]: DEBUG oslo_concurrency.lockutils [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1147.145032] env[65918]: DEBUG nova.virt.vmwareapi.images [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Downloading image file data e017c336-3a02-4b58-874a-44a1d1e154fd to the data store datastore1 {{(pid=65918) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1147.150663] env[65918]: DEBUG oslo_concurrency.lockutils [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.024s {{(pid=65918) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1147.151296] env[65918]: DEBUG nova.compute.utils [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] Instance c04e5253-0275-4fb3-8eca-6a395c95930f could not be found. {{(pid=65918) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1147.153872] env[65918]: DEBUG nova.compute.manager [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] Instance disappeared during build. {{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1147.153872] env[65918]: DEBUG nova.compute.manager [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] Unplugging VIFs for instance {{(pid=65918) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1147.153872] env[65918]: DEBUG nova.compute.manager [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=65918) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1147.153872] env[65918]: DEBUG nova.compute.manager [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] Deallocating network for instance {{(pid=65918) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1147.154631] env[65918]: DEBUG nova.network.neutron [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] deallocate_for_instance() {{(pid=65918) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1147.155928] env[65918]: DEBUG nova.network.neutron [None req-09d594fa-82ce-4c0d-b5cc-e32fe0de1bde tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 81fef129-8f9a-4a19-afc0-f27411c36159] Successfully updated port: c1dc3d30-74bf-4ad2-987e-f5a8e1b75667 {{(pid=65918) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1147.165896] env[65918]: DEBUG oslo_concurrency.lockutils [None req-09d594fa-82ce-4c0d-b5cc-e32fe0de1bde tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Acquiring lock "refresh_cache-81fef129-8f9a-4a19-afc0-f27411c36159" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1147.166068] env[65918]: DEBUG oslo_concurrency.lockutils [None req-09d594fa-82ce-4c0d-b5cc-e32fe0de1bde tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Acquired lock "refresh_cache-81fef129-8f9a-4a19-afc0-f27411c36159" {{(pid=65918) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1147.166229] env[65918]: DEBUG nova.network.neutron [None req-09d594fa-82ce-4c0d-b5cc-e32fe0de1bde tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 81fef129-8f9a-4a19-afc0-f27411c36159] Building network info cache for instance {{(pid=65918) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1147.198913] env[65918]: DEBUG neutronclient.v2_0.client [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=65918) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1147.200460] env[65918]: ERROR nova.compute.manager [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 1147.200460] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] Traceback (most recent call last): [ 1147.200460] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1147.200460] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1147.200460] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1147.200460] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] result = getattr(controller, method)(*args, **kwargs) [ 1147.200460] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1147.200460] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] return self._get(image_id) [ 1147.200460] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1147.200460] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1147.200460] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1147.200865] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] resp, body = self.http_client.get(url, headers=header) [ 1147.200865] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1147.200865] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] return self.request(url, 'GET', **kwargs) [ 1147.200865] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File 
"/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1147.200865] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] return self._handle_response(resp) [ 1147.200865] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1147.200865] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] raise exc.from_response(resp, resp.content) [ 1147.200865] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1147.200865] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] [ 1147.200865] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] During handling of the above exception, another exception occurred: [ 1147.200865] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] [ 1147.200865] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] Traceback (most recent call last): [ 1147.201190] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1147.201190] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] self.driver.spawn(context, instance, image_meta, [ 1147.201190] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1147.201190] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1147.201190] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1147.201190] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] self._fetch_image_if_missing(context, vi) [ 1147.201190] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1147.201190] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] image_fetch(context, vi, tmp_image_ds_loc) [ 1147.201190] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1147.201190] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] images.fetch_image( [ 1147.201190] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1147.201190] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] metadata = IMAGE_API.get(context, image_ref) [ 1147.201190] env[65918]: ERROR nova.compute.manager [instance: 
c04e5253-0275-4fb3-8eca-6a395c95930f] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1147.201508] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] return session.show(context, image_id, [ 1147.201508] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1147.201508] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] _reraise_translated_image_exception(image_id) [ 1147.201508] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1147.201508] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] raise new_exc.with_traceback(exc_trace) [ 1147.201508] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1147.201508] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1147.201508] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1147.201508] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] result = getattr(controller, method)(*args, **kwargs) [ 1147.201508] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1147.201508] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] return self._get(image_id) [ 1147.201508] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1147.201508] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1147.201821] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1147.201821] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] resp, body = self.http_client.get(url, headers=header) [ 1147.201821] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1147.201821] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] return self.request(url, 'GET', **kwargs) [ 1147.201821] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1147.201821] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] return self._handle_response(resp) [ 1147.201821] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1147.201821] env[65918]: ERROR nova.compute.manager [instance: 
c04e5253-0275-4fb3-8eca-6a395c95930f] raise exc.from_response(resp, resp.content) [ 1147.201821] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] nova.exception.ImageNotAuthorized: Not authorized for image e017c336-3a02-4b58-874a-44a1d1e154fd. [ 1147.201821] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] [ 1147.201821] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] During handling of the above exception, another exception occurred: [ 1147.201821] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] [ 1147.201821] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] Traceback (most recent call last): [ 1147.202157] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1147.202157] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] self._build_and_run_instance(context, instance, image, [ 1147.202157] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1147.202157] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] with excutils.save_and_reraise_exception(): [ 1147.202157] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1147.202157] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] self.force_reraise() [ 1147.202157] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1147.202157] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] raise self.value [ 1147.202157] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1147.202157] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] with self.rt.instance_claim(context, instance, node, allocs, [ 1147.202157] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1147.202157] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] self.abort() [ 1147.202157] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1147.202463] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1147.202463] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1147.202463] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] return f(*args, **kwargs) [ 1147.202463] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File 
"/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1147.202463] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] self._unset_instance_host_and_node(instance) [ 1147.202463] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1147.202463] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] instance.save() [ 1147.202463] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1147.202463] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] updates, result = self.indirection_api.object_action( [ 1147.202463] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1147.202463] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] return cctxt.call(context, 'object_action', objinst=objinst, [ 1147.202463] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1147.202463] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] result = self.transport._send( [ 1147.202775] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1147.202775] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] return self._driver.send(target, ctxt, message, [ 1147.202775] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1147.202775] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1147.202775] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1147.202775] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] raise result [ 1147.202775] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] nova.exception_Remote.InstanceNotFound_Remote: Instance c04e5253-0275-4fb3-8eca-6a395c95930f could not be found. 
[ 1147.202775] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] Traceback (most recent call last): [ 1147.202775] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] [ 1147.202775] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1147.202775] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] return getattr(target, method)(*args, **kwargs) [ 1147.202775] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] [ 1147.202775] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1147.203169] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] return fn(self, *args, **kwargs) [ 1147.203169] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] [ 1147.203169] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1147.203169] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] old_ref, inst_ref = db.instance_update_and_get_original( [ 1147.203169] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] [ 1147.203169] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1147.203169] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] return f(*args, **kwargs) [ 1147.203169] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] [ 1147.203169] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1147.203169] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] with excutils.save_and_reraise_exception() as ectxt: [ 1147.203169] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] [ 1147.203169] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1147.203169] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] self.force_reraise() [ 1147.203169] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] [ 1147.203169] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1147.203525] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] raise self.value [ 1147.203525] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] [ 1147.203525] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1147.203525] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] return f(*args, 
**kwargs) [ 1147.203525] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] [ 1147.203525] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1147.203525] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] return f(context, *args, **kwargs) [ 1147.203525] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] [ 1147.203525] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1147.203525] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1147.203525] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] [ 1147.203525] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1147.203525] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] raise exception.InstanceNotFound(instance_id=uuid) [ 1147.203525] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] [ 1147.203525] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] nova.exception.InstanceNotFound: Instance c04e5253-0275-4fb3-8eca-6a395c95930f could not be found. [ 1147.203874] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] [ 1147.203874] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] [ 1147.203874] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] During handling of the above exception, another exception occurred: [ 1147.203874] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] [ 1147.203874] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] Traceback (most recent call last): [ 1147.203874] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1147.203874] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] ret = obj(*args, **kwargs) [ 1147.203874] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1147.203874] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] exception_handler_v20(status_code, error_body) [ 1147.203874] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1147.203874] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] raise client_exc(message=error_message, [ 1147.203874] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1147.203874] 
env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] Neutron server returns request_ids: ['req-c643d799-f236-4c01-8e6d-5f09c0e4c41f'] [ 1147.203874] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] [ 1147.204953] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] During handling of the above exception, another exception occurred: [ 1147.204953] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] [ 1147.204953] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] Traceback (most recent call last): [ 1147.204953] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1147.204953] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] self._deallocate_network(context, instance, requested_networks) [ 1147.204953] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1147.204953] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] self.network_api.deallocate_for_instance( [ 1147.204953] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1147.204953] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] data = neutron.list_ports(**search_opts) [ 1147.204953] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1147.204953] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] ret = obj(*args, **kwargs) [ 1147.204953] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1147.204953] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] return self.list('ports', self.ports_path, retrieve_all, [ 1147.205295] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1147.205295] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] ret = obj(*args, **kwargs) [ 1147.205295] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1147.205295] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] for r in self._pagination(collection, path, **params): [ 1147.205295] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1147.205295] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] res = self.get(path, params=params) [ 1147.205295] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1147.205295] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] ret = obj(*args, **kwargs) [ 1147.205295] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1147.205295] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] return self.retry_request("GET", action, body=body, [ 1147.205295] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1147.205295] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] ret = obj(*args, **kwargs) [ 1147.205295] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1147.205646] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] return self.do_request(method, action, body=body, [ 1147.205646] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1147.205646] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] ret = obj(*args, **kwargs) [ 1147.205646] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1147.205646] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] self._handle_fault_response(status_code, replybody, resp) [ 1147.205646] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1147.205646] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] raise exception.Unauthorized() [ 1147.205646] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] nova.exception.Unauthorized: Not authorized. [ 1147.205646] env[65918]: ERROR nova.compute.manager [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] [ 1147.207761] env[65918]: DEBUG nova.network.neutron [None req-09d594fa-82ce-4c0d-b5cc-e32fe0de1bde tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 81fef129-8f9a-4a19-afc0-f27411c36159] Instance cache missing network info. 
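The cleanup path above fails a second time because the Neutron call itself returns a 401. The frames at nova/network/neutron.py:196 and 204 show Nova's client wrapper converting the neutronclient Unauthorized into nova.exception.Unauthorized. A minimal sketch of that translation step, simplified here to a plain decorator (the real wrapper proxies the whole client object):

    import functools

    from neutronclient.common import exceptions as neutron_client_exc

    from nova import exception

    def translate_neutron_auth_errors(func):
        """Convert a Neutron 401 into Nova's generic Unauthorized error."""
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            try:
                return func(*args, **kwargs)
            except neutron_client_exc.Unauthorized:
                # An expired or revoked Keystone token surfaces here.
                raise exception.Unauthorized()
        return wrapper

    # Usage (illustrative): wrap a neutronclient call site, e.g.
    # list_ports = translate_neutron_auth_errors(neutron.list_ports)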
{{(pid=65918) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1147.222807] env[65918]: DEBUG oslo_concurrency.lockutils [None req-7513f63b-c258-487f-8769-846f18b00309 tempest-InstanceActionsNegativeTestJSON-1671902162 tempest-InstanceActionsNegativeTestJSON-1671902162-project-member] Lock "c04e5253-0275-4fb3-8eca-6a395c95930f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 442.450s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1147.284874] env[65918]: DEBUG oslo_concurrency.lockutils [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1147.285648] env[65918]: ERROR nova.compute.manager [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image e017c336-3a02-4b58-874a-44a1d1e154fd. [ 1147.285648] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Traceback (most recent call last): [ 1147.285648] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1147.285648] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1147.285648] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1147.285648] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] result = getattr(controller, method)(*args, **kwargs) [ 1147.285648] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1147.285648] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] return self._get(image_id) [ 1147.285648] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1147.285648] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1147.285648] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1147.286091] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] resp, body = self.http_client.get(url, headers=header) [ 1147.286091] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1147.286091] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] return self.request(url, 'GET', **kwargs) [ 
1147.286091] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1147.286091] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] return self._handle_response(resp) [ 1147.286091] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1147.286091] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] raise exc.from_response(resp, resp.content) [ 1147.286091] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1147.286091] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] [ 1147.286091] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] During handling of the above exception, another exception occurred: [ 1147.286091] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] [ 1147.286091] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Traceback (most recent call last): [ 1147.286452] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1147.286452] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] yield resources [ 1147.286452] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1147.286452] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] self.driver.spawn(context, instance, image_meta, [ 1147.286452] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1147.286452] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1147.286452] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1147.286452] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] self._fetch_image_if_missing(context, vi) [ 1147.286452] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1147.286452] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] image_fetch(context, vi, tmp_image_ds_loc) [ 1147.286452] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1147.286452] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] 
images.fetch_image( [ 1147.286452] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1147.286840] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] metadata = IMAGE_API.get(context, image_ref) [ 1147.286840] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1147.286840] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] return session.show(context, image_id, [ 1147.286840] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1147.286840] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] _reraise_translated_image_exception(image_id) [ 1147.286840] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1147.286840] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] raise new_exc.with_traceback(exc_trace) [ 1147.286840] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1147.286840] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1147.286840] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1147.286840] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] result = getattr(controller, method)(*args, **kwargs) [ 1147.286840] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1147.286840] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] return self._get(image_id) [ 1147.287400] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1147.287400] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1147.287400] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1147.287400] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] resp, body = self.http_client.get(url, headers=header) [ 1147.287400] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1147.287400] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] return self.request(url, 'GET', **kwargs) [ 1147.287400] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1147.287400] 
env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] return self._handle_response(resp) [ 1147.287400] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1147.287400] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] raise exc.from_response(resp, resp.content) [ 1147.287400] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] nova.exception.ImageNotAuthorized: Not authorized for image e017c336-3a02-4b58-874a-44a1d1e154fd. [ 1147.287400] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] [ 1147.287722] env[65918]: INFO nova.compute.manager [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Terminating instance [ 1147.287918] env[65918]: DEBUG oslo_concurrency.lockutils [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1147.288171] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1147.288680] env[65918]: DEBUG oslo_concurrency.lockutils [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Acquiring lock "refresh_cache-46e1dfe1-df73-430c-85ef-f5753974eed0" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1147.288831] env[65918]: DEBUG oslo_concurrency.lockutils [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Acquired lock "refresh_cache-46e1dfe1-df73-430c-85ef-f5753974eed0" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1147.288990] env[65918]: DEBUG nova.network.neutron [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Building network info cache for instance {{(pid=65918) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1147.289893] env[65918]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d5308465-cf7e-4529-b087-42b45c2f6afa {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1147.297344] env[65918]: DEBUG nova.compute.utils [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Can not refresh info_cache because instance was not found {{(pid=65918) refresh_info_cache_for_instance 
/opt/stack/nova/nova/compute/utils.py:1010}} [ 1147.300812] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1147.301100] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=65918) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1147.302112] env[65918]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-890f13bd-6e0b-47b2-9759-87bc8e73b13a {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1147.308356] env[65918]: DEBUG oslo_vmware.api [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] Waiting for the task: (returnval){ [ 1147.308356] env[65918]: value = "session[523f5ec2-7667-1e2c-bfa3-3051980fd847]52b080da-a3a3-f42e-670f-e9f59237be1a" [ 1147.308356] env[65918]: _type = "Task" [ 1147.308356] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1147.316200] env[65918]: DEBUG oslo_vmware.api [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] Task: {'id': session[523f5ec2-7667-1e2c-bfa3-3051980fd847]52b080da-a3a3-f42e-670f-e9f59237be1a, 'name': SearchDatastore_Task} progress is 0%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1147.340791] env[65918]: DEBUG nova.network.neutron [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Instance cache missing network info. 
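The Acquiring/Acquired/Releasing lock lines around the devstack-image-cache_base VMDK path come from oslo.concurrency's lockutils, which serialises access to each cached image file within a worker. A minimal sketch of that pattern, with an illustrative placeholder for the cache work (not Nova's exact code):

    from oslo_concurrency import lockutils

    # Illustrative cache key; it matches the lock name seen in the log.
    CACHE_KEY = ("[datastore1] devstack-image-cache_base/"
                 "e017c336-3a02-4b58-874a-44a1d1e154fd/"
                 "e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk")

    def work_on_cached_image(key):
        # Placeholder for the real fetch-or-reuse logic (hypothetical helper).
        print("using cached image at", key)

    # lockutils.lock() is a context manager; only one greenthread in this
    # worker holds the named lock at a time, which is what produces the
    # "Acquiring"/"Acquired"/"Releasing" lines with the held/waited times.
    with lockutils.lock(CACHE_KEY):
        work_on_cached_image(CACHE_KEY)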
{{(pid=65918) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1147.428847] env[65918]: DEBUG nova.network.neutron [None req-09d594fa-82ce-4c0d-b5cc-e32fe0de1bde tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 81fef129-8f9a-4a19-afc0-f27411c36159] Updating instance_info_cache with network_info: [{"id": "c1dc3d30-74bf-4ad2-987e-f5a8e1b75667", "address": "fa:16:3e:96:74:6b", "network": {"id": "6f8eb533-989d-48a0-b84e-5549e9b4efc9", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.249", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "a06639ef48fd43768e273db76d6c8f54", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c0d5204b-f60e-4830-84c8-2fe246c28202", "external-id": "nsx-vlan-transportzone-104", "segmentation_id": 104, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc1dc3d30-74", "ovs_interfaceid": "c1dc3d30-74bf-4ad2-987e-f5a8e1b75667", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1147.439347] env[65918]: DEBUG oslo_concurrency.lockutils [None req-09d594fa-82ce-4c0d-b5cc-e32fe0de1bde tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Releasing lock "refresh_cache-81fef129-8f9a-4a19-afc0-f27411c36159" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1147.439661] env[65918]: DEBUG nova.compute.manager [None req-09d594fa-82ce-4c0d-b5cc-e32fe0de1bde tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 81fef129-8f9a-4a19-afc0-f27411c36159] Instance network_info: |[{"id": "c1dc3d30-74bf-4ad2-987e-f5a8e1b75667", "address": "fa:16:3e:96:74:6b", "network": {"id": "6f8eb533-989d-48a0-b84e-5549e9b4efc9", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.249", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "a06639ef48fd43768e273db76d6c8f54", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c0d5204b-f60e-4830-84c8-2fe246c28202", "external-id": "nsx-vlan-transportzone-104", "segmentation_id": 104, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc1dc3d30-74", "ovs_interfaceid": "c1dc3d30-74bf-4ad2-987e-f5a8e1b75667", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=65918) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1147.440043] env[65918]: DEBUG 
nova.virt.vmwareapi.vmops [None req-09d594fa-82ce-4c0d-b5cc-e32fe0de1bde tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 81fef129-8f9a-4a19-afc0-f27411c36159] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:96:74:6b', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'c0d5204b-f60e-4830-84c8-2fe246c28202', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'c1dc3d30-74bf-4ad2-987e-f5a8e1b75667', 'vif_model': 'vmxnet3'}] {{(pid=65918) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1147.448094] env[65918]: DEBUG oslo.service.loopingcall [None req-09d594fa-82ce-4c0d-b5cc-e32fe0de1bde tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=65918) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1147.448590] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 81fef129-8f9a-4a19-afc0-f27411c36159] Creating VM on the ESX host {{(pid=65918) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1147.448806] env[65918]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-02de5e48-0782-4153-9111-b2c1d7fe13fc {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1147.463653] env[65918]: DEBUG nova.network.neutron [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Updating instance_info_cache with network_info: [] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1147.470388] env[65918]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1147.470388] env[65918]: value = "task-2848242" [ 1147.470388] env[65918]: _type = "Task" [ 1147.470388] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1147.473907] env[65918]: DEBUG oslo_concurrency.lockutils [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Releasing lock "refresh_cache-46e1dfe1-df73-430c-85ef-f5753974eed0" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1147.474251] env[65918]: DEBUG nova.compute.manager [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Start destroying the instance on the hypervisor. 
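The "Waiting for function ... to return" lines come from oslo.service's loopingcall helper, which re-invokes a callable on an interval until it signals completion. A hedged sketch of the simpler fixed-interval variant of that pattern (the helper body below is a placeholder; Nova's own retry loops may use the backoff variant):

    from oslo_service import loopingcall

    def _attempt():
        # Placeholder body; a real retry loop would do its work here and
        # only raise LoopingCallDone once it has succeeded.
        raise loopingcall.LoopingCallDone(retvalue=True)

    # FixedIntervalLoopingCall re-invokes _attempt every `interval` seconds;
    # start() returns an event whose wait() yields the LoopingCallDone value.
    timer = loopingcall.FixedIntervalLoopingCall(_attempt)
    result = timer.start(interval=1.0).wait()
    print("loop finished with", result)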
{{(pid=65918) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1147.474441] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Destroying instance {{(pid=65918) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1147.475629] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db7d393f-f2e6-494c-b7ee-339491e162fa {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1147.481318] env[65918]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848242, 'name': CreateVM_Task} progress is 0%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1147.485219] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Unregistering the VM {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1147.485592] env[65918]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-f9744196-c303-4b4c-a005-f81e7b70e49f {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1147.512477] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Unregistered the VM {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1147.512694] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Deleting contents of the VM from datastore datastore1 {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1147.512869] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Deleting the datastore file [datastore1] 46e1dfe1-df73-430c-85ef-f5753974eed0 {{(pid=65918) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1147.513132] env[65918]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-7e889536-594c-4945-ade7-15cc26c054c7 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1147.521457] env[65918]: DEBUG oslo_vmware.api [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Waiting for the task: (returnval){ [ 1147.521457] env[65918]: value = "task-2848244" [ 1147.521457] env[65918]: _type = "Task" [ 1147.521457] env[65918]: } to complete. 
{{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1147.529427] env[65918]: DEBUG oslo_vmware.api [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Task: {'id': task-2848244, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1147.820533] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] [instance: c9932955-3b82-4c30-9441-b33695340ed2] Preparing fetch location {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1147.820791] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] Creating directory with path [datastore1] vmware_temp/018fa867-e106-4694-8a24-66f57d49ffd6/e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1147.821032] env[65918]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1837d74d-01a3-4cdf-939e-443416b16bb4 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1147.832771] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] Created directory with path [datastore1] vmware_temp/018fa867-e106-4694-8a24-66f57d49ffd6/e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1147.832972] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] [instance: c9932955-3b82-4c30-9441-b33695340ed2] Fetch image to [datastore1] vmware_temp/018fa867-e106-4694-8a24-66f57d49ffd6/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1147.833156] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] [instance: c9932955-3b82-4c30-9441-b33695340ed2] Downloading image file data e017c336-3a02-4b58-874a-44a1d1e154fd to [datastore1] vmware_temp/018fa867-e106-4694-8a24-66f57d49ffd6/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk on the data store datastore1 {{(pid=65918) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1147.833887] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb91b0c1-3a15-42aa-af4d-ceb1c4c6e7e6 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1147.842717] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-116d172e-db84-4db3-93b4-e030b07c1420 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1147.855023] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-47553875-e36b-40da-b1ab-ef76223e22a8 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1147.888476] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ca757af-35ee-4bda-b519-35a182240603 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1147.894416] env[65918]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-a9e8f4ac-b0ad-4bb9-9bf1-506856d57ec7 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1147.918252] env[65918]: DEBUG nova.virt.vmwareapi.images [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] [instance: c9932955-3b82-4c30-9441-b33695340ed2] Downloading image file data e017c336-3a02-4b58-874a-44a1d1e154fd to the data store datastore1 {{(pid=65918) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1147.981089] env[65918]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848242, 'name': CreateVM_Task} progress is 99%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1148.031598] env[65918]: DEBUG oslo_vmware.api [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Task: {'id': task-2848244, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.044952} completed successfully. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1148.031745] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Deleted the datastore file {{(pid=65918) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1148.031922] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Deleted contents of the VM from datastore datastore1 {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1148.032106] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Instance destroyed {{(pid=65918) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1148.032278] env[65918]: INFO nova.compute.manager [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Took 0.56 seconds to destroy the instance on the hypervisor. [ 1148.033444] env[65918]: DEBUG oslo.service.loopingcall [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
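The task lines above (CreateVM_Task at 99%, DeleteDatastoreFile_Task completed with a duration) reflect oslo.vmware's task polling: the caller submits a *_Task SOAP method through the session and then blocks in wait_for_task until vCenter reports success or failure. A sketch under the assumption that session is an established oslo_vmware.api.VMwareAPISession and that file_manager and datacenter are managed object references obtained from it:

    def delete_datastore_file(session, file_manager, path, datacenter):
        """Submit DeleteDatastoreFile_Task and block until it finishes."""
        task_ref = session.invoke_api(session.vim, 'DeleteDatastoreFile_Task',
                                      file_manager, name=path,
                                      datacenter=datacenter)
        # wait_for_task() polls the task (the "progress is N%" lines) and
        # raises an oslo_vmware exception if the task ends in an error state.
        return session.wait_for_task(task_ref)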
{{(pid=65918) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1148.033444] env[65918]: DEBUG nova.compute.manager [-] [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Deallocating network for instance {{(pid=65918) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1148.033444] env[65918]: DEBUG nova.network.neutron [-] [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] deallocate_for_instance() {{(pid=65918) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1148.039070] env[65918]: DEBUG oslo_concurrency.lockutils [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1148.039823] env[65918]: ERROR nova.compute.manager [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] [instance: c9932955-3b82-4c30-9441-b33695340ed2] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image e017c336-3a02-4b58-874a-44a1d1e154fd. [ 1148.039823] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] Traceback (most recent call last): [ 1148.039823] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1148.039823] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1148.039823] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1148.039823] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] result = getattr(controller, method)(*args, **kwargs) [ 1148.039823] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1148.039823] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] return self._get(image_id) [ 1148.039823] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1148.039823] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1148.039823] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1148.040160] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] resp, body = self.http_client.get(url, headers=header) [ 1148.040160] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1148.040160] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] return self.request(url, 'GET', **kwargs) [ 1148.040160] env[65918]: ERROR nova.compute.manager [instance: 
c9932955-3b82-4c30-9441-b33695340ed2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1148.040160] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] return self._handle_response(resp) [ 1148.040160] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1148.040160] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] raise exc.from_response(resp, resp.content) [ 1148.040160] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1148.040160] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] [ 1148.040160] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] During handling of the above exception, another exception occurred: [ 1148.040160] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] [ 1148.040160] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] Traceback (most recent call last): [ 1148.040552] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1148.040552] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] yield resources [ 1148.040552] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1148.040552] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] self.driver.spawn(context, instance, image_meta, [ 1148.040552] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1148.040552] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1148.040552] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1148.040552] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] self._fetch_image_if_missing(context, vi) [ 1148.040552] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1148.040552] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] image_fetch(context, vi, tmp_image_ds_loc) [ 1148.040552] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1148.040552] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] images.fetch_image( [ 1148.040552] env[65918]: ERROR nova.compute.manager 
[instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1148.040949] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] metadata = IMAGE_API.get(context, image_ref) [ 1148.040949] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1148.040949] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] return session.show(context, image_id, [ 1148.040949] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1148.040949] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] _reraise_translated_image_exception(image_id) [ 1148.040949] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1148.040949] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] raise new_exc.with_traceback(exc_trace) [ 1148.040949] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1148.040949] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1148.040949] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1148.040949] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] result = getattr(controller, method)(*args, **kwargs) [ 1148.040949] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1148.040949] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] return self._get(image_id) [ 1148.041352] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1148.041352] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1148.041352] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1148.041352] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] resp, body = self.http_client.get(url, headers=header) [ 1148.041352] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1148.041352] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] return self.request(url, 'GET', **kwargs) [ 1148.041352] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1148.041352] env[65918]: ERROR nova.compute.manager [instance: 
c9932955-3b82-4c30-9441-b33695340ed2] return self._handle_response(resp) [ 1148.041352] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1148.041352] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] raise exc.from_response(resp, resp.content) [ 1148.041352] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] nova.exception.ImageNotAuthorized: Not authorized for image e017c336-3a02-4b58-874a-44a1d1e154fd. [ 1148.041352] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] [ 1148.041641] env[65918]: INFO nova.compute.manager [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] [instance: c9932955-3b82-4c30-9441-b33695340ed2] Terminating instance [ 1148.041897] env[65918]: DEBUG oslo_concurrency.lockutils [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1148.041897] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1148.042520] env[65918]: DEBUG nova.compute.manager [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] [instance: c9932955-3b82-4c30-9441-b33695340ed2] Start destroying the instance on the hypervisor. 
{{(pid=65918) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1148.042711] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] [instance: c9932955-3b82-4c30-9441-b33695340ed2] Destroying instance {{(pid=65918) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1148.042929] env[65918]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-edda5f41-2c33-49b6-b755-6de540f02f04 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1148.045888] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1dd51aa7-3936-415b-84f9-4a622683cd49 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1148.053336] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] [instance: c9932955-3b82-4c30-9441-b33695340ed2] Unregistering the VM {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1148.053581] env[65918]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-56779056-8a11-4097-9aa7-364ae200ea11 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1148.056372] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1148.056558] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=65918) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1148.059112] env[65918]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-59e37bde-dca1-40ca-b1ca-9cc988d4f285 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1148.066019] env[65918]: DEBUG oslo_vmware.api [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] Waiting for the task: (returnval){ [ 1148.066019] env[65918]: value = "session[523f5ec2-7667-1e2c-bfa3-3051980fd847]5217a7d8-c05c-34b7-17d3-d747d662d182" [ 1148.066019] env[65918]: _type = "Task" [ 1148.066019] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1148.072598] env[65918]: DEBUG oslo_vmware.api [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] Task: {'id': session[523f5ec2-7667-1e2c-bfa3-3051980fd847]5217a7d8-c05c-34b7-17d3-d747d662d182, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1148.121605] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] [instance: c9932955-3b82-4c30-9441-b33695340ed2] Unregistered the VM {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1148.121882] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] [instance: c9932955-3b82-4c30-9441-b33695340ed2] Deleting contents of the VM from datastore datastore1 {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1148.122128] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] Deleting the datastore file [datastore1] c9932955-3b82-4c30-9441-b33695340ed2 {{(pid=65918) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1148.122531] env[65918]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-87b1f0da-3367-4fb8-acd1-93d614ef2356 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1148.130819] env[65918]: DEBUG oslo_vmware.api [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] Waiting for the task: (returnval){ [ 1148.130819] env[65918]: value = "task-2848246" [ 1148.130819] env[65918]: _type = "Task" [ 1148.130819] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1148.135686] env[65918]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=65918) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1148.136076] env[65918]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token, please verify Neutron admin credential located in nova.conf: neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1148.137101] env[65918]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
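The failed looping call above names oslo_service.loopingcall.RetryDecorator as the wrapper around the network deallocation that keeps hitting the 401. As a rough, standalone illustration of that retry pattern only (the function name, retry counts, and exception type below are invented for the example and are not Nova code):

    from oslo_service import loopingcall

    attempts = {"count": 0}

    # Hypothetical stand-in for the retried call (e.g. the neutron.list_ports
    # call in the traceback that follows); it fails twice, then succeeds.
    @loopingcall.RetryDecorator(max_retry_count=3, inc_sleep_time=1,
                                max_sleep_time=5, exceptions=(RuntimeError,))
    def list_ports():
        attempts["count"] += 1
        if attempts["count"] < 3:
            raise RuntimeError("transient failure stand-in")
        return ["port-a"]

    # The decorator runs the function inside a dynamic looping call, sleeping
    # between attempts and re-raising once the retry budget is exhausted.
    print(list_ports())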
[ 1148.137101] env[65918]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1148.137101] env[65918]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1148.137101] env[65918]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1148.137101] env[65918]: ERROR oslo.service.loopingcall File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1148.137101] env[65918]: ERROR oslo.service.loopingcall exception_handler_v20(status_code, error_body) [ 1148.137101] env[65918]: ERROR oslo.service.loopingcall File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1148.137101] env[65918]: ERROR oslo.service.loopingcall raise client_exc(message=error_message, [ 1148.137101] env[65918]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1148.137101] env[65918]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-1cc3f49f-374d-42cf-96b7-47d73f5e45b5'] [ 1148.137101] env[65918]: ERROR oslo.service.loopingcall [ 1148.137101] env[65918]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred: [ 1148.137101] env[65918]: ERROR oslo.service.loopingcall [ 1148.137101] env[65918]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1148.137101] env[65918]: ERROR oslo.service.loopingcall File "/usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1148.137101] env[65918]: ERROR oslo.service.loopingcall result = func(*self.args, **self.kw) [ 1148.137611] env[65918]: ERROR oslo.service.loopingcall File "/usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py", line 407, in _func [ 1148.137611] env[65918]: ERROR oslo.service.loopingcall result = f(*args, **kwargs) [ 1148.137611] env[65918]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 3041, in _deallocate_network_with_retries [ 1148.137611] env[65918]: ERROR oslo.service.loopingcall self._deallocate_network( [ 1148.137611] env[65918]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1148.137611] env[65918]: ERROR oslo.service.loopingcall self.network_api.deallocate_for_instance( [ 1148.137611] env[65918]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1148.137611] env[65918]: ERROR oslo.service.loopingcall data = neutron.list_ports(**search_opts) [ 1148.137611] env[65918]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1148.137611] env[65918]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1148.137611] env[65918]: ERROR oslo.service.loopingcall File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1148.137611] env[65918]: ERROR oslo.service.loopingcall return self.list('ports', self.ports_path, retrieve_all, [ 1148.137611] env[65918]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1148.137611] env[65918]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1148.137611] env[65918]: ERROR oslo.service.loopingcall File 
"/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1148.137611] env[65918]: ERROR oslo.service.loopingcall for r in self._pagination(collection, path, **params): [ 1148.137611] env[65918]: ERROR oslo.service.loopingcall File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1148.137611] env[65918]: ERROR oslo.service.loopingcall res = self.get(path, params=params) [ 1148.138100] env[65918]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1148.138100] env[65918]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1148.138100] env[65918]: ERROR oslo.service.loopingcall File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1148.138100] env[65918]: ERROR oslo.service.loopingcall return self.retry_request("GET", action, body=body, [ 1148.138100] env[65918]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1148.138100] env[65918]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1148.138100] env[65918]: ERROR oslo.service.loopingcall File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1148.138100] env[65918]: ERROR oslo.service.loopingcall return self.do_request(method, action, body=body, [ 1148.138100] env[65918]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1148.138100] env[65918]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1148.138100] env[65918]: ERROR oslo.service.loopingcall File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1148.138100] env[65918]: ERROR oslo.service.loopingcall self._handle_fault_response(status_code, replybody, resp) [ 1148.138100] env[65918]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1148.138100] env[65918]: ERROR oslo.service.loopingcall raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1148.138100] env[65918]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1148.138100] env[65918]: ERROR oslo.service.loopingcall [ 1148.138589] env[65918]: ERROR nova.compute.manager [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Failed to deallocate network for instance. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1148.147719] env[65918]: DEBUG oslo_vmware.api [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] Task: {'id': task-2848246, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1148.163403] env[65918]: DEBUG nova.compute.manager [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Instance has been destroyed from under us while trying to set it to ERROR {{(pid=65918) _set_instance_obj_error_state /opt/stack/nova/nova/compute/manager.py:728}} [ 1148.163717] env[65918]: WARNING nova.compute.manager [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Could not clean up failed build, not rescheduling. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1148.163930] env[65918]: DEBUG nova.compute.claims [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Aborting claim: {{(pid=65918) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1148.164130] env[65918]: DEBUG oslo_concurrency.lockutils [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1148.164363] env[65918]: DEBUG oslo_concurrency.lockutils [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1148.190757] env[65918]: DEBUG oslo_concurrency.lockutils [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.026s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1148.191482] env[65918]: DEBUG nova.compute.utils [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Instance 46e1dfe1-df73-430c-85ef-f5753974eed0 could not be found. {{(pid=65918) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1148.193408] env[65918]: DEBUG nova.compute.manager [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Instance disappeared during build. 
{{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1148.194038] env[65918]: DEBUG nova.compute.manager [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Unplugging VIFs for instance {{(pid=65918) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1148.194038] env[65918]: DEBUG oslo_concurrency.lockutils [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Acquiring lock "refresh_cache-46e1dfe1-df73-430c-85ef-f5753974eed0" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1148.194160] env[65918]: DEBUG oslo_concurrency.lockutils [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Acquired lock "refresh_cache-46e1dfe1-df73-430c-85ef-f5753974eed0" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1148.194238] env[65918]: DEBUG nova.network.neutron [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Building network info cache for instance {{(pid=65918) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1148.201712] env[65918]: DEBUG nova.compute.utils [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Can not refresh info_cache because instance was not found {{(pid=65918) refresh_info_cache_for_instance /opt/stack/nova/nova/compute/utils.py:1010}} [ 1148.219861] env[65918]: DEBUG nova.network.neutron [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Instance cache missing network info. {{(pid=65918) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1148.288153] env[65918]: DEBUG nova.network.neutron [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Updating instance_info_cache with network_info: [] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1148.299463] env[65918]: DEBUG oslo_concurrency.lockutils [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Releasing lock "refresh_cache-46e1dfe1-df73-430c-85ef-f5753974eed0" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1148.299463] env[65918]: DEBUG nova.compute.manager [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=65918) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1148.299463] env[65918]: DEBUG nova.compute.manager [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Deallocating network for instance {{(pid=65918) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1148.299463] env[65918]: DEBUG nova.network.neutron [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] deallocate_for_instance() {{(pid=65918) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1148.414101] env[65918]: DEBUG neutronclient.v2_0.client [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=65918) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1148.414101] env[65918]: ERROR nova.compute.manager [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 1148.414101] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Traceback (most recent call last): [ 1148.414101] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1148.414101] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] ret = obj(*args, **kwargs) [ 1148.414101] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1148.414101] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] exception_handler_v20(status_code, error_body) [ 1148.414101] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1148.414101] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] raise client_exc(message=error_message, [ 1148.414406] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1148.414406] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Neutron server returns request_ids: ['req-1cc3f49f-374d-42cf-96b7-47d73f5e45b5'] [ 1148.414406] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] [ 1148.414406] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] During handling of the above exception, another exception occurred: [ 1148.414406] env[65918]: ERROR nova.compute.manager [instance: 
46e1dfe1-df73-430c-85ef-f5753974eed0] [ 1148.414406] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Traceback (most recent call last): [ 1148.414406] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/opt/stack/nova/nova/compute/manager.py", line 2881, in _build_resources [ 1148.414406] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] self._shutdown_instance(context, instance, [ 1148.414406] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/opt/stack/nova/nova/compute/manager.py", line 3140, in _shutdown_instance [ 1148.414406] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] self._try_deallocate_network(context, instance, requested_networks) [ 1148.414406] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/opt/stack/nova/nova/compute/manager.py", line 3054, in _try_deallocate_network [ 1148.414406] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] with excutils.save_and_reraise_exception(): [ 1148.414406] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1148.414797] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] self.force_reraise() [ 1148.414797] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1148.414797] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] raise self.value [ 1148.414797] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/opt/stack/nova/nova/compute/manager.py", line 3052, in _try_deallocate_network [ 1148.414797] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] _deallocate_network_with_retries() [ 1148.414797] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py", line 436, in func [ 1148.414797] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] return evt.wait() [ 1148.414797] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1148.414797] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] result = hub.switch() [ 1148.414797] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1148.414797] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] return self.greenlet.switch() [ 1148.414797] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1148.414797] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] result = func(*self.args, **self.kw) [ 1148.415129] env[65918]: ERROR nova.compute.manager [instance: 
46e1dfe1-df73-430c-85ef-f5753974eed0] File "/usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py", line 407, in _func [ 1148.415129] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] result = f(*args, **kwargs) [ 1148.415129] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/opt/stack/nova/nova/compute/manager.py", line 3041, in _deallocate_network_with_retries [ 1148.415129] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] self._deallocate_network( [ 1148.415129] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1148.415129] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] self.network_api.deallocate_for_instance( [ 1148.415129] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1148.415129] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] data = neutron.list_ports(**search_opts) [ 1148.415129] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1148.415129] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] ret = obj(*args, **kwargs) [ 1148.415129] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1148.415129] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] return self.list('ports', self.ports_path, retrieve_all, [ 1148.415129] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1148.415435] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] ret = obj(*args, **kwargs) [ 1148.415435] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1148.415435] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] for r in self._pagination(collection, path, **params): [ 1148.415435] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1148.415435] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] res = self.get(path, params=params) [ 1148.415435] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1148.415435] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] ret = obj(*args, **kwargs) [ 1148.415435] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1148.415435] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] return 
self.retry_request("GET", action, body=body, [ 1148.415435] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1148.415435] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] ret = obj(*args, **kwargs) [ 1148.415435] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1148.415435] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] return self.do_request(method, action, body=body, [ 1148.415746] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1148.415746] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] ret = obj(*args, **kwargs) [ 1148.415746] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1148.415746] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] self._handle_fault_response(status_code, replybody, resp) [ 1148.415746] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1148.415746] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1148.415746] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1148.415746] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] [ 1148.415746] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] During handling of the above exception, another exception occurred: [ 1148.415746] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] [ 1148.415746] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Traceback (most recent call last): [ 1148.415746] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/opt/stack/nova/nova/compute/manager.py", line 2594, in _build_and_run_instance [ 1148.415746] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] with self._build_resources(context, instance, [ 1148.416090] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/usr/lib/python3.10/contextlib.py", line 153, in __exit__ [ 1148.416090] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] self.gen.throw(typ, value, traceback) [ 1148.416090] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/opt/stack/nova/nova/compute/manager.py", line 2889, in _build_resources [ 1148.416090] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] raise exception.BuildAbortException( [ 1148.416090] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] nova.exception.BuildAbortException: Build of instance 46e1dfe1-df73-430c-85ef-f5753974eed0 aborted: Not authorized for image e017c336-3a02-4b58-874a-44a1d1e154fd. [ 1148.416090] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] [ 1148.416090] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] During handling of the above exception, another exception occurred: [ 1148.416090] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] [ 1148.416090] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Traceback (most recent call last): [ 1148.416090] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1148.416090] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] self._build_and_run_instance(context, instance, image, [ 1148.416090] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1148.416090] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] with excutils.save_and_reraise_exception(): [ 1148.416420] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1148.416420] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] self.force_reraise() [ 1148.416420] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1148.416420] env[65918]: ERROR nova.compute.manager [instance: 
46e1dfe1-df73-430c-85ef-f5753974eed0] raise self.value [ 1148.416420] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1148.416420] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] with self.rt.instance_claim(context, instance, node, allocs, [ 1148.416420] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1148.416420] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] self.abort() [ 1148.416420] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1148.416420] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1148.416420] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1148.416420] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] return f(*args, **kwargs) [ 1148.416420] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1148.416732] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] self._unset_instance_host_and_node(instance) [ 1148.416732] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1148.416732] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] instance.save() [ 1148.416732] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1148.416732] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] updates, result = self.indirection_api.object_action( [ 1148.416732] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1148.416732] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] return cctxt.call(context, 'object_action', objinst=objinst, [ 1148.416732] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1148.416732] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] result = self.transport._send( [ 1148.416732] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1148.416732] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] return self._driver.send(target, ctxt, message, [ 1148.416732] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File 
"/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1148.417032] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1148.417032] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1148.417032] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] raise result [ 1148.417032] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] nova.exception_Remote.InstanceNotFound_Remote: Instance 46e1dfe1-df73-430c-85ef-f5753974eed0 could not be found. [ 1148.417032] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Traceback (most recent call last): [ 1148.417032] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] [ 1148.417032] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1148.417032] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] return getattr(target, method)(*args, **kwargs) [ 1148.417032] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] [ 1148.417032] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1148.417032] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] return fn(self, *args, **kwargs) [ 1148.417032] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] [ 1148.417032] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1148.417032] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] old_ref, inst_ref = db.instance_update_and_get_original( [ 1148.417374] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] [ 1148.417374] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1148.417374] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] return f(*args, **kwargs) [ 1148.417374] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] [ 1148.417374] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1148.417374] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] with excutils.save_and_reraise_exception() as ectxt: [ 1148.417374] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] [ 1148.417374] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1148.417374] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] self.force_reraise() [ 1148.417374] env[65918]: ERROR 
nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] [ 1148.417374] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1148.417374] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] raise self.value [ 1148.417374] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] [ 1148.417374] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1148.417374] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] return f(*args, **kwargs) [ 1148.417374] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] [ 1148.417740] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1148.417740] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] return f(context, *args, **kwargs) [ 1148.417740] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] [ 1148.417740] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1148.417740] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1148.417740] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] [ 1148.417740] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1148.417740] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] raise exception.InstanceNotFound(instance_id=uuid) [ 1148.417740] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] [ 1148.417740] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] nova.exception.InstanceNotFound: Instance 46e1dfe1-df73-430c-85ef-f5753974eed0 could not be found. 
[ 1148.417740] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] [ 1148.417740] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] [ 1148.417740] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] During handling of the above exception, another exception occurred: [ 1148.417740] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] [ 1148.417740] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Traceback (most recent call last): [ 1148.418100] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1148.418100] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] ret = obj(*args, **kwargs) [ 1148.418100] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1148.418100] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] exception_handler_v20(status_code, error_body) [ 1148.418100] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1148.418100] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] raise client_exc(message=error_message, [ 1148.418100] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1148.418100] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Neutron server returns request_ids: ['req-c6c9b538-b211-4ff5-8ead-33adfa85ddfa'] [ 1148.418100] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] [ 1148.418100] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] During handling of the above exception, another exception occurred: [ 1148.418100] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] [ 1148.418100] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Traceback (most recent call last): [ 1148.418100] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1148.418426] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] self._deallocate_network(context, instance, requested_networks) [ 1148.418426] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1148.418426] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] self.network_api.deallocate_for_instance( [ 1148.418426] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1148.418426] env[65918]: ERROR nova.compute.manager [instance: 
46e1dfe1-df73-430c-85ef-f5753974eed0] data = neutron.list_ports(**search_opts) [ 1148.418426] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1148.418426] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] ret = obj(*args, **kwargs) [ 1148.418426] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1148.418426] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] return self.list('ports', self.ports_path, retrieve_all, [ 1148.418426] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1148.418426] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] ret = obj(*args, **kwargs) [ 1148.418426] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1148.418426] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] for r in self._pagination(collection, path, **params): [ 1148.418731] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1148.418731] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] res = self.get(path, params=params) [ 1148.418731] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1148.418731] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] ret = obj(*args, **kwargs) [ 1148.418731] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1148.418731] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] return self.retry_request("GET", action, body=body, [ 1148.418731] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1148.418731] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] ret = obj(*args, **kwargs) [ 1148.418731] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1148.418731] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] return self.do_request(method, action, body=body, [ 1148.418731] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1148.418731] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] ret = obj(*args, **kwargs) [ 1148.418731] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, 
in do_request [ 1148.419035] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] self._handle_fault_response(status_code, replybody, resp) [ 1148.419035] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1148.419035] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] raise exception.Unauthorized() [ 1148.419035] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] nova.exception.Unauthorized: Not authorized. [ 1148.419035] env[65918]: ERROR nova.compute.manager [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] [ 1148.439273] env[65918]: DEBUG oslo_concurrency.lockutils [None req-11906651-cc41-48f4-bbb5-a4cbbf2a76c2 tempest-ServersNegativeTestJSON-1921025626 tempest-ServersNegativeTestJSON-1921025626-project-member] Lock "46e1dfe1-df73-430c-85ef-f5753974eed0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 433.099s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1148.481370] env[65918]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848242, 'name': CreateVM_Task} progress is 99%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1148.574346] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] [instance: 3faaaacf-815e-4493-81a7-2a32f868442a] Preparing fetch location {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1148.574675] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] Creating directory with path [datastore1] vmware_temp/405bce2f-749d-4c29-8f2b-619aabf32322/e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1148.574832] env[65918]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-135218b7-8e8e-43a9-86d2-aeeca7bc5bea {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1148.586920] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] Created directory with path [datastore1] vmware_temp/405bce2f-749d-4c29-8f2b-619aabf32322/e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1148.588090] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] [instance: 3faaaacf-815e-4493-81a7-2a32f868442a] Fetch image to [datastore1] vmware_temp/405bce2f-749d-4c29-8f2b-619aabf32322/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1148.588090] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 
tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] [instance: 3faaaacf-815e-4493-81a7-2a32f868442a] Downloading image file data e017c336-3a02-4b58-874a-44a1d1e154fd to [datastore1] vmware_temp/405bce2f-749d-4c29-8f2b-619aabf32322/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk on the data store datastore1 {{(pid=65918) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1148.588090] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c67ed92-c6a2-4bf8-9b24-3092dbe9ddd0 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1148.594982] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66f30e54-a389-45a1-9c7c-411f4b737b1e {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1148.606077] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61a4c973-551f-450f-a5de-c020c90c335c {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1148.641381] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1dfbec22-1655-4690-808b-90167fecbbf8 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1148.650562] env[65918]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-85bc7ffa-78c7-4e6c-9c49-689c371a48de {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1148.652443] env[65918]: DEBUG oslo_vmware.api [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] Task: {'id': task-2848246, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.07012} completed successfully. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1148.652679] env[65918]: DEBUG nova.virt.vmwareapi.ds_util [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] Deleted the datastore file {{(pid=65918) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1148.652895] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] [instance: c9932955-3b82-4c30-9441-b33695340ed2] Deleted contents of the VM from datastore datastore1 {{(pid=65918) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1148.653132] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] [instance: c9932955-3b82-4c30-9441-b33695340ed2] Instance destroyed {{(pid=65918) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1148.653274] env[65918]: INFO nova.compute.manager [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] [instance: c9932955-3b82-4c30-9441-b33695340ed2] Took 0.61 seconds to destroy the instance on the hypervisor. 
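
The DeleteDatastoreFile_Task records just above are produced by oslo.vmware's task helpers. A minimal sketch of that call pattern follows; the host, credentials, datastore path and datacenter reference are placeholders for illustration, not values taken from this log.

    from oslo_vmware import api

    # Sketch only: placeholder endpoint and credentials.
    session = api.VMwareAPISession(
        'vc.example.test',          # vCenter host (placeholder)
        'administrator',            # username (placeholder)
        'secret',                   # password (placeholder)
        api_retry_count=10,         # retries for transient SOAP faults
        task_poll_interval=0.5)     # seconds between polls; polling shows up as the _poll_task lines

    vim = session.vim
    dc_moref = None                 # placeholder: a Datacenter managed-object ref in real code
    task = session.invoke_api(
        vim, 'DeleteDatastoreFile_Task',
        vim.service_content.fileManager,
        name='[datastore1] vmware_temp/example-dir',   # placeholder datastore path
        datacenter=dc_moref)
    session.wait_for_task(task)     # blocks until vCenter reports the task finished or failed

The ds_util.file_delete call referenced in the surrounding records wraps this same DeleteDatastoreFile_Task invocation inside Nova's vmwareapi driver.
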
[ 1148.655336] env[65918]: DEBUG nova.compute.claims [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] [instance: c9932955-3b82-4c30-9441-b33695340ed2] Aborting claim: {{(pid=65918) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1148.655528] env[65918]: DEBUG oslo_concurrency.lockutils [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1148.655733] env[65918]: DEBUG oslo_concurrency.lockutils [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1148.674019] env[65918]: DEBUG nova.virt.vmwareapi.images [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] [instance: 3faaaacf-815e-4493-81a7-2a32f868442a] Downloading image file data e017c336-3a02-4b58-874a-44a1d1e154fd to the data store datastore1 {{(pid=65918) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1148.679988] env[65918]: DEBUG oslo_concurrency.lockutils [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.024s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1148.680489] env[65918]: DEBUG nova.compute.utils [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] [instance: c9932955-3b82-4c30-9441-b33695340ed2] Instance c9932955-3b82-4c30-9441-b33695340ed2 could not be found. {{(pid=65918) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1148.682143] env[65918]: DEBUG nova.compute.manager [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] [instance: c9932955-3b82-4c30-9441-b33695340ed2] Instance disappeared during build. {{(pid=65918) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1148.682320] env[65918]: DEBUG nova.compute.manager [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] [instance: c9932955-3b82-4c30-9441-b33695340ed2] Unplugging VIFs for instance {{(pid=65918) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1148.683865] env[65918]: DEBUG nova.compute.manager [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=65918) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1148.683865] env[65918]: DEBUG nova.compute.manager [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] [instance: c9932955-3b82-4c30-9441-b33695340ed2] Deallocating network for instance {{(pid=65918) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1148.683865] env[65918]: DEBUG nova.network.neutron [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] [instance: c9932955-3b82-4c30-9441-b33695340ed2] deallocate_for_instance() {{(pid=65918) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1148.730772] env[65918]: DEBUG oslo_vmware.rw_handles [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/405bce2f-749d-4c29-8f2b-619aabf32322/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=65918) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1148.791134] env[65918]: DEBUG oslo_vmware.rw_handles [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] Completed reading data from the image iterator. {{(pid=65918) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1148.791362] env[65918]: DEBUG oslo_vmware.rw_handles [None req-00dc0c87-025b-4028-9b2c-a081508706fc tempest-ServerMetadataNegativeTestJSON-1306515900 tempest-ServerMetadataNegativeTestJSON-1306515900-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/405bce2f-749d-4c29-8f2b-619aabf32322/e017c336-3a02-4b58-874a-44a1d1e154fd/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=65918) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1148.841278] env[65918]: DEBUG neutronclient.v2_0.client [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=65918) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1148.842913] env[65918]: ERROR nova.compute.manager [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] [instance: c9932955-3b82-4c30-9441-b33695340ed2] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 1148.842913] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] Traceback (most recent call last): [ 1148.842913] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1148.842913] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1148.842913] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1148.842913] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] result = getattr(controller, method)(*args, **kwargs) [ 1148.842913] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1148.842913] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] return self._get(image_id) [ 1148.842913] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1148.842913] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1148.842913] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1148.842913] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] resp, body = self.http_client.get(url, headers=header) [ 1148.843307] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1148.843307] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] return self.request(url, 'GET', **kwargs) [ 1148.843307] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1148.843307] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] return self._handle_response(resp) [ 1148.843307] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1148.843307] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] raise exc.from_response(resp, resp.content) [ 1148.843307] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1148.843307] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] [ 1148.843307] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] During handling of the above exception, another exception occurred: [ 1148.843307] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] [ 1148.843307] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] Traceback (most recent call last): [ 1148.843307] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1148.843809] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] self.driver.spawn(context, instance, image_meta, [ 1148.843809] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1148.843809] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1148.843809] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1148.843809] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] self._fetch_image_if_missing(context, vi) [ 1148.843809] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1148.843809] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] image_fetch(context, vi, tmp_image_ds_loc) [ 1148.843809] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1148.843809] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] images.fetch_image( [ 1148.843809] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1148.843809] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] metadata = IMAGE_API.get(context, image_ref) [ 1148.843809] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1148.843809] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] return session.show(context, image_id, [ 1148.844184] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1148.844184] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] _reraise_translated_image_exception(image_id) [ 1148.844184] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1148.844184] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] raise new_exc.with_traceback(exc_trace) [ 1148.844184] env[65918]: ERROR nova.compute.manager [instance: 
c9932955-3b82-4c30-9441-b33695340ed2] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1148.844184] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1148.844184] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1148.844184] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] result = getattr(controller, method)(*args, **kwargs) [ 1148.844184] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1148.844184] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] return self._get(image_id) [ 1148.844184] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1148.844184] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1148.844184] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1148.844536] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] resp, body = self.http_client.get(url, headers=header) [ 1148.844536] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1148.844536] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] return self.request(url, 'GET', **kwargs) [ 1148.844536] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1148.844536] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] return self._handle_response(resp) [ 1148.844536] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1148.844536] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] raise exc.from_response(resp, resp.content) [ 1148.844536] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] nova.exception.ImageNotAuthorized: Not authorized for image e017c336-3a02-4b58-874a-44a1d1e154fd. 
[ 1148.844536] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] [ 1148.844536] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] During handling of the above exception, another exception occurred: [ 1148.844536] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] [ 1148.844536] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] Traceback (most recent call last): [ 1148.844536] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1148.844934] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] self._build_and_run_instance(context, instance, image, [ 1148.844934] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1148.844934] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] with excutils.save_and_reraise_exception(): [ 1148.844934] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1148.844934] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] self.force_reraise() [ 1148.844934] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1148.844934] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] raise self.value [ 1148.844934] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1148.844934] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] with self.rt.instance_claim(context, instance, node, allocs, [ 1148.844934] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1148.844934] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] self.abort() [ 1148.844934] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1148.844934] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1148.845333] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1148.845333] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] return f(*args, **kwargs) [ 1148.845333] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1148.845333] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] self._unset_instance_host_and_node(instance) [ 1148.845333] env[65918]: ERROR nova.compute.manager 
[instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1148.845333] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] instance.save() [ 1148.845333] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1148.845333] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] updates, result = self.indirection_api.object_action( [ 1148.845333] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1148.845333] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] return cctxt.call(context, 'object_action', objinst=objinst, [ 1148.845333] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1148.845333] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] result = self.transport._send( [ 1148.845723] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1148.845723] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] return self._driver.send(target, ctxt, message, [ 1148.845723] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1148.845723] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1148.845723] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1148.845723] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] raise result [ 1148.845723] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] nova.exception_Remote.InstanceNotFound_Remote: Instance c9932955-3b82-4c30-9441-b33695340ed2 could not be found. 
[ 1148.845723] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] Traceback (most recent call last): [ 1148.845723] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] [ 1148.845723] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1148.845723] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] return getattr(target, method)(*args, **kwargs) [ 1148.845723] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] [ 1148.845723] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1148.846096] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] return fn(self, *args, **kwargs) [ 1148.846096] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] [ 1148.846096] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1148.846096] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] old_ref, inst_ref = db.instance_update_and_get_original( [ 1148.846096] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] [ 1148.846096] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1148.846096] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] return f(*args, **kwargs) [ 1148.846096] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] [ 1148.846096] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1148.846096] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] with excutils.save_and_reraise_exception() as ectxt: [ 1148.846096] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] [ 1148.846096] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1148.846096] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] self.force_reraise() [ 1148.846096] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] [ 1148.846096] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1148.846511] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] raise self.value [ 1148.846511] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] [ 1148.846511] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1148.846511] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] return f(*args, 
**kwargs) [ 1148.846511] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] [ 1148.846511] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1148.846511] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] return f(context, *args, **kwargs) [ 1148.846511] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] [ 1148.846511] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1148.846511] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1148.846511] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] [ 1148.846511] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1148.846511] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] raise exception.InstanceNotFound(instance_id=uuid) [ 1148.846511] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] [ 1148.846511] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] nova.exception.InstanceNotFound: Instance c9932955-3b82-4c30-9441-b33695340ed2 could not be found. [ 1148.846941] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] [ 1148.846941] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] [ 1148.846941] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] During handling of the above exception, another exception occurred: [ 1148.846941] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] [ 1148.846941] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] Traceback (most recent call last): [ 1148.846941] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1148.846941] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] ret = obj(*args, **kwargs) [ 1148.846941] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1148.846941] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] exception_handler_v20(status_code, error_body) [ 1148.846941] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1148.846941] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] raise client_exc(message=error_message, [ 1148.846941] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1148.846941] 
env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] Neutron server returns request_ids: ['req-536ce1d4-a908-49a4-9c7f-14b0d2956577'] [ 1148.846941] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] [ 1148.847422] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] During handling of the above exception, another exception occurred: [ 1148.847422] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] [ 1148.847422] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] Traceback (most recent call last): [ 1148.847422] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1148.847422] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] self._deallocate_network(context, instance, requested_networks) [ 1148.847422] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1148.847422] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] self.network_api.deallocate_for_instance( [ 1148.847422] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1148.847422] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] data = neutron.list_ports(**search_opts) [ 1148.847422] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1148.847422] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] ret = obj(*args, **kwargs) [ 1148.847422] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1148.847422] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] return self.list('ports', self.ports_path, retrieve_all, [ 1148.847794] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1148.847794] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] ret = obj(*args, **kwargs) [ 1148.847794] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1148.847794] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] for r in self._pagination(collection, path, **params): [ 1148.847794] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1148.847794] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] res = self.get(path, params=params) [ 1148.847794] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1148.847794] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] ret = obj(*args, **kwargs) [ 1148.847794] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1148.847794] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] return self.retry_request("GET", action, body=body, [ 1148.847794] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1148.847794] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] ret = obj(*args, **kwargs) [ 1148.847794] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1148.848209] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] return self.do_request(method, action, body=body, [ 1148.848209] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1148.848209] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] ret = obj(*args, **kwargs) [ 1148.848209] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1148.848209] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] self._handle_fault_response(status_code, replybody, resp) [ 1148.848209] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1148.848209] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] raise exception.Unauthorized() [ 1148.848209] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] nova.exception.Unauthorized: Not authorized. [ 1148.848209] env[65918]: ERROR nova.compute.manager [instance: c9932955-3b82-4c30-9441-b33695340ed2] [ 1148.870642] env[65918]: DEBUG oslo_concurrency.lockutils [None req-f5a153d1-118a-41bb-9e2a-b0a1cc53e152 tempest-ServerRescueTestJSON-764146281 tempest-ServerRescueTestJSON-764146281-project-member] Lock "c9932955-3b82-4c30-9441-b33695340ed2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 326.188s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1148.982861] env[65918]: DEBUG oslo_vmware.api [-] Task: {'id': task-2848242, 'name': CreateVM_Task, 'duration_secs': 1.289397} completed successfully. 
{{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1148.983043] env[65918]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 81fef129-8f9a-4a19-afc0-f27411c36159] Created VM on the ESX host {{(pid=65918) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1148.983693] env[65918]: DEBUG oslo_concurrency.lockutils [None req-09d594fa-82ce-4c0d-b5cc-e32fe0de1bde tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1148.983851] env[65918]: DEBUG oslo_concurrency.lockutils [None req-09d594fa-82ce-4c0d-b5cc-e32fe0de1bde tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Acquired lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1148.984176] env[65918]: DEBUG oslo_concurrency.lockutils [None req-09d594fa-82ce-4c0d-b5cc-e32fe0de1bde tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1148.984410] env[65918]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-87e2a5de-cc4f-406c-bb7e-d25b7bac594d {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1148.988761] env[65918]: DEBUG oslo_vmware.api [None req-09d594fa-82ce-4c0d-b5cc-e32fe0de1bde tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Waiting for the task: (returnval){ [ 1148.988761] env[65918]: value = "session[523f5ec2-7667-1e2c-bfa3-3051980fd847]52248f85-d78e-47ff-046a-8fa40c9002e7" [ 1148.988761] env[65918]: _type = "Task" [ 1148.988761] env[65918]: } to complete. {{(pid=65918) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1148.997177] env[65918]: DEBUG oslo_vmware.api [None req-09d594fa-82ce-4c0d-b5cc-e32fe0de1bde tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Task: {'id': session[523f5ec2-7667-1e2c-bfa3-3051980fd847]52248f85-d78e-47ff-046a-8fa40c9002e7, 'name': SearchDatastore_Task} progress is 0%. {{(pid=65918) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1149.100364] env[65918]: DEBUG nova.compute.manager [req-bbcb6af0-f006-49ce-8062-49822119d19f req-cd70f8f5-cee2-4289-a47d-e9f22b27ef09 service nova] [instance: 81fef129-8f9a-4a19-afc0-f27411c36159] Received event network-changed-c1dc3d30-74bf-4ad2-987e-f5a8e1b75667 {{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1149.101838] env[65918]: DEBUG nova.compute.manager [req-bbcb6af0-f006-49ce-8062-49822119d19f req-cd70f8f5-cee2-4289-a47d-e9f22b27ef09 service nova] [instance: 81fef129-8f9a-4a19-afc0-f27411c36159] Refreshing instance network info cache due to event network-changed-c1dc3d30-74bf-4ad2-987e-f5a8e1b75667. 
{{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 1149.101838] env[65918]: DEBUG oslo_concurrency.lockutils [req-bbcb6af0-f006-49ce-8062-49822119d19f req-cd70f8f5-cee2-4289-a47d-e9f22b27ef09 service nova] Acquiring lock "refresh_cache-81fef129-8f9a-4a19-afc0-f27411c36159" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1149.101838] env[65918]: DEBUG oslo_concurrency.lockutils [req-bbcb6af0-f006-49ce-8062-49822119d19f req-cd70f8f5-cee2-4289-a47d-e9f22b27ef09 service nova] Acquired lock "refresh_cache-81fef129-8f9a-4a19-afc0-f27411c36159" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1149.101838] env[65918]: DEBUG nova.network.neutron [req-bbcb6af0-f006-49ce-8062-49822119d19f req-cd70f8f5-cee2-4289-a47d-e9f22b27ef09 service nova] [instance: 81fef129-8f9a-4a19-afc0-f27411c36159] Refreshing network info cache for port c1dc3d30-74bf-4ad2-987e-f5a8e1b75667 {{(pid=65918) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1149.499971] env[65918]: DEBUG oslo_concurrency.lockutils [None req-09d594fa-82ce-4c0d-b5cc-e32fe0de1bde tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Releasing lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1149.500287] env[65918]: DEBUG nova.virt.vmwareapi.vmops [None req-09d594fa-82ce-4c0d-b5cc-e32fe0de1bde tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] [instance: 81fef129-8f9a-4a19-afc0-f27411c36159] Processing image e017c336-3a02-4b58-874a-44a1d1e154fd {{(pid=65918) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1149.500559] env[65918]: DEBUG oslo_concurrency.lockutils [None req-09d594fa-82ce-4c0d-b5cc-e32fe0de1bde tempest-DeleteServersAdminTestJSON-1412865218 tempest-DeleteServersAdminTestJSON-1412865218-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/e017c336-3a02-4b58-874a-44a1d1e154fd/e017c336-3a02-4b58-874a-44a1d1e154fd.vmdk" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1149.550882] env[65918]: DEBUG nova.network.neutron [req-bbcb6af0-f006-49ce-8062-49822119d19f req-cd70f8f5-cee2-4289-a47d-e9f22b27ef09 service nova] [instance: 81fef129-8f9a-4a19-afc0-f27411c36159] Updated VIF entry in instance network info cache for port c1dc3d30-74bf-4ad2-987e-f5a8e1b75667. 
{{(pid=65918) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1149.551264] env[65918]: DEBUG nova.network.neutron [req-bbcb6af0-f006-49ce-8062-49822119d19f req-cd70f8f5-cee2-4289-a47d-e9f22b27ef09 service nova] [instance: 81fef129-8f9a-4a19-afc0-f27411c36159] Updating instance_info_cache with network_info: [{"id": "c1dc3d30-74bf-4ad2-987e-f5a8e1b75667", "address": "fa:16:3e:96:74:6b", "network": {"id": "6f8eb533-989d-48a0-b84e-5549e9b4efc9", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.249", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "a06639ef48fd43768e273db76d6c8f54", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c0d5204b-f60e-4830-84c8-2fe246c28202", "external-id": "nsx-vlan-transportzone-104", "segmentation_id": 104, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc1dc3d30-74", "ovs_interfaceid": "c1dc3d30-74bf-4ad2-987e-f5a8e1b75667", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=65918) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1149.561401] env[65918]: DEBUG oslo_concurrency.lockutils [req-bbcb6af0-f006-49ce-8062-49822119d19f req-cd70f8f5-cee2-4289-a47d-e9f22b27ef09 service nova] Releasing lock "refresh_cache-81fef129-8f9a-4a19-afc0-f27411c36159" {{(pid=65918) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1151.051564] env[65918]: DEBUG nova.compute.manager [req-ddbd29b1-862a-4a15-8094-f8ed34f9923e req-de875e39-7e51-451c-9073-0a38269240d7 service nova] [instance: 81fef129-8f9a-4a19-afc0-f27411c36159] Received event network-vif-deleted-c1dc3d30-74bf-4ad2-987e-f5a8e1b75667 {{(pid=65918) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1176.421058] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1176.423496] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1177.424265] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1177.424710] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=65918) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 1178.424221] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1178.424468] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1179.430361] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1179.430727] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Starting heal instance info cache {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 1179.430727] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Rebuilding the list of instances to heal {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 1179.439561] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Didn't find any instances for network info cache update. {{(pid=65918) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 1179.439776] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager.update_available_resource {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1179.448751] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1179.448950] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1179.449131] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1179.449286] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=65918) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1179.450381] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e12166f8-e6ca-4298-85b9-a942ed862198 {{(pid=65918) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1179.458884] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd722dbb-eead-482e-b90c-338dd5ca2cff {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1179.473322] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-57f08c16-1b48-4f15-81d7-4c773c34efe0 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1179.479303] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11415d54-9733-4400-abfd-cca62b7a15a5 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1179.507389] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181050MB free_disk=168GB free_vcpus=48 pci_devices=None {{(pid=65918) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}}
[ 1179.507579] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1179.507728] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1179.541483] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=65918) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}}
[ 1179.541664] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=65918) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}}
[ 1179.556860] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a9a949b4-a624-4877-bf3f-cc8ad042857b {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1179.564802] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-beb914c2-5e95-4789-be2a-254b6ca596d7 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1179.595615] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20a1423f-dffc-44f7-8606-6e72d5ca1c00 {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1179.602957] env[65918]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1fb64926-3dfd-4788-a7de-eaf09887735b {{(pid=65918) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1179.616052] env[65918]: DEBUG nova.compute.provider_tree [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Inventory has not changed in ProviderTree for provider: 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 {{(pid=65918) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1179.624010] env[65918]: DEBUG nova.scheduler.client.report [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Inventory has not changed for provider 0bcf3fd3-93ee-4c0a-abed-95169e714cc4 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 168, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=65918) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1179.637100] env[65918]: DEBUG nova.compute.resource_tracker [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=65918) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}}
[ 1179.637272] env[65918]: DEBUG oslo_concurrency.lockutils [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.130s {{(pid=65918) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1179.637482] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1179.637620] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Cleaning up deleted instances with incomplete migration {{(pid=65918) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11139}}
[ 1180.629024] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1181.423969] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1181.424239] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1182.419433] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1184.423478] env[65918]: DEBUG oslo_service.periodic_task [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=65918) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1184.423853] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] Cleaning up deleted instances {{(pid=65918) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1184.454295] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] There are 11 instances to clean {{(pid=65918) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11110}}
[ 1184.454573] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: 81fef129-8f9a-4a19-afc0-f27411c36159] Instance has had 0 of 5 cleanup attempts {{(pid=65918) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}}
[ 1184.488427] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: a0f06a58-65d2-4325-8f93-0948b4e5ac8c] Instance has had 0 of 5 cleanup attempts {{(pid=65918) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}}
[ 1184.523588] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: 8017964a-7fe8-40eb-a79d-47e0401a27d1] Instance has had 0 of 5 cleanup attempts {{(pid=65918) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}}
[ 1184.548768] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: 3faaaacf-815e-4493-81a7-2a32f868442a] Instance has had 0 of 5 cleanup attempts {{(pid=65918) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}}
[ 1184.568828] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: c9932955-3b82-4c30-9441-b33695340ed2] Instance has had 0 of 5 cleanup attempts {{(pid=65918) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}}
[ 1184.588210] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: 46e1dfe1-df73-430c-85ef-f5753974eed0] Instance has had 0 of 5 cleanup attempts {{(pid=65918) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}}
[ 1184.605915] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: 78576ca1-7755-4532-82ee-de46c9d3a1fc] Instance has had 0 of 5 cleanup attempts {{(pid=65918) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}}
[ 1184.624828] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: c04e5253-0275-4fb3-8eca-6a395c95930f] Instance has had 0 of 5 cleanup attempts {{(pid=65918) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}}
[ 1184.642482] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: a4d6dc5b-f4a4-4e45-9e4a-187bbddeb398] Instance has had 0 of 5 cleanup attempts {{(pid=65918) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}}
[ 1184.659546] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: 0ccebca0-a1a4-48b2-9154-1c73350dab38] Instance has had 0 of 5 cleanup attempts {{(pid=65918) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}}
[ 1184.676935] env[65918]: DEBUG nova.compute.manager [None req-32ffbafa-5870-4871-bce4-645109ad039d None None] [instance: 3b3f8c10-5ba5-445c-a51d-5404874df3d9] Instance has had 0 of 5 cleanup attempts {{(pid=65918) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}}
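For reference, the inventory reported at 1179.624010 is what the scheduler ultimately sizes this host against: Placement derives schedulable capacity per resource class as (total - reserved) * allocation_ratio. The minimal Python sketch below (illustrative only, not Nova or Placement source code) reproduces that arithmetic using the values copied verbatim from the log entry above:

    # Inventory values copied from the nova.scheduler.client.report entry at 1179.624010.
    inventory = {
        'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0},
    }

    # Capacity formula assumed from Placement's standard calculation:
    # capacity = (total - reserved) * allocation_ratio
    for rc, inv in inventory.items():
        capacity = int((inv['total'] - inv['reserved']) * inv['allocation_ratio'])
        print(f"{rc}: schedulable capacity = {capacity}")
    # Expected output: VCPU 192, MEMORY_MB 196078, DISK_GB 400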